Mar 13 20:27:47 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 13 20:27:47 crc restorecon[4688]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 20:27:48 crc restorecon[4688]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc 
restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc 
restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 
20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc 
restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 20:27:48 crc 
restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 
crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 
20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 20:27:48 crc 
restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc 
restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc 
restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 20:27:48 crc restorecon[4688]:
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 
crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc 
restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc 
restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc 
restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc 
restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc 
restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 20:27:48 crc restorecon[4688]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 13 20:27:49 crc kubenswrapper[4790]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 20:27:49 crc kubenswrapper[4790]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 13 20:27:49 crc kubenswrapper[4790]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 20:27:49 crc kubenswrapper[4790]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 13 20:27:49 crc kubenswrapper[4790]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 13 20:27:49 crc kubenswrapper[4790]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.416921 4790 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422357 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422425 4790 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422437 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422448 4790 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422462 4790 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422475 4790 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422486 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422497 4790 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422507 4790 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422517 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422526 4790 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422537 4790 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422547 4790 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422557 4790 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422567 4790 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422579 4790 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422614 4790 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422622 4790 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422630 4790 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422641 4790 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422653 4790 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422666 4790 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422701 4790 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422715 4790 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422727 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422747 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422762 4790 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422773 4790 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422783 4790 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422793 4790 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422803 4790 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422816 4790 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422826 4790 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422836 4790 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422845 4790 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422855 4790 feature_gate.go:330] unrecognized feature gate: Example Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422865 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422875 4790 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 
20:27:49.422883 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422891 4790 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422899 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422906 4790 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422914 4790 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422923 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422931 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422938 4790 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422946 4790 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422954 4790 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422961 4790 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422969 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422976 4790 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422984 4790 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423010 4790 
feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423019 4790 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423030 4790 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423037 4790 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423045 4790 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423053 4790 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423061 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423068 4790 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423076 4790 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423083 4790 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423091 4790 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423105 4790 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423115 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423125 4790 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423135 4790 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 20:27:49 crc kubenswrapper[4790]: 
W0313 20:27:49.423145 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423155 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423165 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423175 4790 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424082 4790 flags.go:64] FLAG: --address="0.0.0.0" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424116 4790 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424145 4790 flags.go:64] FLAG: --anonymous-auth="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424161 4790 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424176 4790 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424185 4790 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424197 4790 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424208 4790 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424218 4790 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424227 4790 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424240 4790 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424249 4790 flags.go:64] FLAG: 
--cert-dir="/var/lib/kubelet/pki" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424258 4790 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424267 4790 flags.go:64] FLAG: --cgroup-root="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424277 4790 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424286 4790 flags.go:64] FLAG: --client-ca-file="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424295 4790 flags.go:64] FLAG: --cloud-config="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424316 4790 flags.go:64] FLAG: --cloud-provider="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424325 4790 flags.go:64] FLAG: --cluster-dns="[]" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424347 4790 flags.go:64] FLAG: --cluster-domain="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424355 4790 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424365 4790 flags.go:64] FLAG: --config-dir="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424373 4790 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424419 4790 flags.go:64] FLAG: --container-log-max-files="5" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424431 4790 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424440 4790 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424450 4790 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424459 4790 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424470 4790 flags.go:64] FLAG: 
--contention-profiling="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424478 4790 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424488 4790 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424497 4790 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424506 4790 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424517 4790 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424526 4790 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424535 4790 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424544 4790 flags.go:64] FLAG: --enable-load-reader="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424553 4790 flags.go:64] FLAG: --enable-server="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424562 4790 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424583 4790 flags.go:64] FLAG: --event-burst="100" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424593 4790 flags.go:64] FLAG: --event-qps="50" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424603 4790 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424616 4790 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424627 4790 flags.go:64] FLAG: --eviction-hard="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424641 4790 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424652 4790 flags.go:64] FLAG: 
--eviction-minimum-reclaim="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424663 4790 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424675 4790 flags.go:64] FLAG: --eviction-soft="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424686 4790 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424696 4790 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424706 4790 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424717 4790 flags.go:64] FLAG: --experimental-mounter-path="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424728 4790 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424754 4790 flags.go:64] FLAG: --fail-swap-on="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424764 4790 flags.go:64] FLAG: --feature-gates="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424775 4790 flags.go:64] FLAG: --file-check-frequency="20s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424784 4790 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424794 4790 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424803 4790 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424812 4790 flags.go:64] FLAG: --healthz-port="10248" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424821 4790 flags.go:64] FLAG: --help="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424831 4790 flags.go:64] FLAG: --hostname-override="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424839 4790 flags.go:64] FLAG: 
--housekeeping-interval="10s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424848 4790 flags.go:64] FLAG: --http-check-frequency="20s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424857 4790 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424866 4790 flags.go:64] FLAG: --image-credential-provider-config="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424875 4790 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424884 4790 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424892 4790 flags.go:64] FLAG: --image-service-endpoint="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424901 4790 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424910 4790 flags.go:64] FLAG: --kube-api-burst="100" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424920 4790 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424929 4790 flags.go:64] FLAG: --kube-api-qps="50" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424939 4790 flags.go:64] FLAG: --kube-reserved="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424948 4790 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424956 4790 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424966 4790 flags.go:64] FLAG: --kubelet-cgroups="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424975 4790 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424983 4790 flags.go:64] FLAG: --lock-file="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424992 4790 
flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425001 4790 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425010 4790 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425027 4790 flags.go:64] FLAG: --log-json-split-stream="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425038 4790 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425049 4790 flags.go:64] FLAG: --log-text-split-stream="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425059 4790 flags.go:64] FLAG: --logging-format="text" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425071 4790 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425083 4790 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425093 4790 flags.go:64] FLAG: --manifest-url="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425115 4790 flags.go:64] FLAG: --manifest-url-header="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425127 4790 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425137 4790 flags.go:64] FLAG: --max-open-files="1000000" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425148 4790 flags.go:64] FLAG: --max-pods="110" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425157 4790 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425166 4790 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425177 4790 flags.go:64] FLAG: --memory-manager-policy="None" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 
20:27:49.425188 4790 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425199 4790 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425212 4790 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425223 4790 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425250 4790 flags.go:64] FLAG: --node-status-max-images="50" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425260 4790 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425273 4790 flags.go:64] FLAG: --oom-score-adj="-999" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425285 4790 flags.go:64] FLAG: --pod-cidr="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425296 4790 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425313 4790 flags.go:64] FLAG: --pod-manifest-path="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425323 4790 flags.go:64] FLAG: --pod-max-pids="-1" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425335 4790 flags.go:64] FLAG: --pods-per-core="0" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425346 4790 flags.go:64] FLAG: --port="10250" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425357 4790 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425368 4790 flags.go:64] FLAG: --provider-id="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425440 4790 flags.go:64] FLAG: --qos-reserved="" Mar 13 20:27:49 crc 
kubenswrapper[4790]: I0313 20:27:49.425454 4790 flags.go:64] FLAG: --read-only-port="10255" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425467 4790 flags.go:64] FLAG: --register-node="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425477 4790 flags.go:64] FLAG: --register-schedulable="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425487 4790 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425505 4790 flags.go:64] FLAG: --registry-burst="10" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425516 4790 flags.go:64] FLAG: --registry-qps="5" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425527 4790 flags.go:64] FLAG: --reserved-cpus="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425538 4790 flags.go:64] FLAG: --reserved-memory="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425552 4790 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425563 4790 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425575 4790 flags.go:64] FLAG: --rotate-certificates="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425586 4790 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425598 4790 flags.go:64] FLAG: --runonce="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425627 4790 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425640 4790 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425652 4790 flags.go:64] FLAG: --seccomp-default="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425663 4790 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 13 20:27:49 crc 
kubenswrapper[4790]: I0313 20:27:49.425675 4790 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425688 4790 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425700 4790 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425712 4790 flags.go:64] FLAG: --storage-driver-password="root" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425723 4790 flags.go:64] FLAG: --storage-driver-secure="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425734 4790 flags.go:64] FLAG: --storage-driver-table="stats" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425745 4790 flags.go:64] FLAG: --storage-driver-user="root" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425756 4790 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425768 4790 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425780 4790 flags.go:64] FLAG: --system-cgroups="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425793 4790 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425812 4790 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425823 4790 flags.go:64] FLAG: --tls-cert-file="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425834 4790 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425856 4790 flags.go:64] FLAG: --tls-min-version="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425868 4790 flags.go:64] FLAG: --tls-private-key-file="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425879 4790 flags.go:64] FLAG: 
--topology-manager-policy="none" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425890 4790 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425901 4790 flags.go:64] FLAG: --topology-manager-scope="container" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425913 4790 flags.go:64] FLAG: --v="2" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425928 4790 flags.go:64] FLAG: --version="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425942 4790 flags.go:64] FLAG: --vmodule="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425954 4790 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425966 4790 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426265 4790 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426282 4790 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426295 4790 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426305 4790 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426316 4790 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426329 4790 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426341 4790 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426353 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426420 4790 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426434 4790 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426445 4790 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426455 4790 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426463 4790 feature_gate.go:330] unrecognized feature gate: Example Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426472 4790 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426480 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426488 4790 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426496 4790 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426503 4790 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426511 4790 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426519 4790 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426527 4790 feature_gate.go:330] unrecognized 
feature gate: MachineAPIProviderOpenStack Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426535 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426543 4790 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426551 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426558 4790 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426566 4790 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426574 4790 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426582 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426590 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426599 4790 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426608 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426618 4790 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426636 4790 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426652 4790 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426662 4790 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 
13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426721 4790 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426731 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426741 4790 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426751 4790 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426761 4790 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426771 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426781 4790 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426791 4790 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426801 4790 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426815 4790 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426825 4790 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426836 4790 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426846 4790 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426856 4790 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426873 4790 
feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426892 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426904 4790 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426918 4790 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426930 4790 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426942 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426953 4790 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426963 4790 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426973 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426983 4790 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426993 4790 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.427003 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.427014 4790 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.427024 4790 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.427033 4790 
feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.427045 4790 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.427054 4790 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.427064 4790 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.427082 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.427091 4790 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.427099 4790 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.427107 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.427120 4790 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.439243 4790 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.439282 4790 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439365 4790 feature_gate.go:330] 
unrecognized feature gate: ClusterAPIInstall Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439392 4790 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439399 4790 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439404 4790 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439408 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439412 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439417 4790 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439423 4790 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439429 4790 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439435 4790 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439440 4790 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439444 4790 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439449 4790 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439454 4790 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439460 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439465 4790 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439470 4790 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439475 4790 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439479 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439483 4790 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439487 4790 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439492 4790 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439497 4790 
feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439501 4790 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439506 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439509 4790 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439515 4790 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439524 4790 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439529 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439533 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439537 4790 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439544 4790 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439549 4790 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439554 4790 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439559 4790 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439565 4790 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439570 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439574 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439579 4790 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439584 4790 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439589 4790 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439594 4790 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439598 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439603 4790 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439607 4790 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439613 4790 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439619 4790 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439624 4790 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439628 4790 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 20:27:49 crc 
kubenswrapper[4790]: W0313 20:27:49.439632 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439637 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439641 4790 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439646 4790 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439650 4790 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439654 4790 feature_gate.go:330] unrecognized feature gate: Example Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439658 4790 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439662 4790 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439666 4790 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439671 4790 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439674 4790 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439678 4790 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439682 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439686 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439691 4790 
feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439697 4790 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439701 4790 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439706 4790 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439710 4790 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439714 4790 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439718 4790 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439723 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.439731 4790 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439925 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439935 4790 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439941 4790 feature_gate.go:330] unrecognized feature gate: 
NetworkDiagnosticsConfig Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439947 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439952 4790 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439959 4790 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439964 4790 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439969 4790 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439973 4790 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439978 4790 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439982 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439988 4790 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439994 4790 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439999 4790 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440004 4790 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440009 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440015 4790 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440021 4790 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440026 4790 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440033 4790 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440038 4790 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440043 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440048 4790 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440053 4790 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440058 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440063 4790 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440067 4790 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440072 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440076 4790 feature_gate.go:330] unrecognized feature gate: Example Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440080 4790 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440084 4790 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440088 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440091 4790 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 
20:27:49.440095 4790 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440099 4790 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440103 4790 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440107 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440112 4790 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440115 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440119 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440125 4790 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440129 4790 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440132 4790 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440137 4790 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440140 4790 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440144 4790 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440148 4790 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440151 4790 feature_gate.go:330] unrecognized feature gate: 
EtcdBackendQuota Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440155 4790 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440159 4790 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440163 4790 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440167 4790 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440171 4790 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440176 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440180 4790 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440184 4790 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440188 4790 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440192 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440196 4790 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440200 4790 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440203 4790 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440208 4790 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440212 
4790 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440216 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440220 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440224 4790 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440228 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440232 4790 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440236 4790 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440240 4790 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440244 4790 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.440250 4790 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.440488 4790 server.go:940] "Client rotation is on, will bootstrap in background" Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.443741 4790 bootstrap.go:266] "Unhandled Error" 
err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.448758 4790 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.448933 4790 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.451513 4790 server.go:997] "Starting client certificate rotation" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.451575 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.451990 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.474994 4790 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.477567 4790 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.478784 4790 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.493755 4790 log.go:25] "Validated CRI v1 runtime API" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.527496 4790 log.go:25] "Validated 
CRI v1 image API" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.531771 4790 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.537791 4790 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-13-20-23-01-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.537831 4790 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.561348 4790 manager.go:217] Machine: {Timestamp:2026-03-13 20:27:49.558223126 +0000 UTC m=+0.579339057 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654116352 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:e656ddb5-8fa2-4c70-bd3f-f718d29b7550 BootID:ddb77a45-6df3-4ccf-8361-682222076454 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108168 HasInodes:true} {Device:/dev/shm 
DeviceMajor:0 DeviceMinor:22 Capacity:16827056128 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:9f:1d:06 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:9f:1d:06 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:4d:d1:84 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:4c:81:52 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:20:f9:ca Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:f0:0d:ac Speed:-1 Mtu:1496} {Name:eth10 MacAddress:52:65:0e:75:1e:5a Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:a6:39:71:d8:37:c7 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654116352 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} 
{Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified 
Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.561667 4790 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.561823 4790 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.566489 4790 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.566781 4790 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.566823 4790 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.567139 4790 topology_manager.go:138] "Creating topology manager with none policy" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.567159 4790 container_manager_linux.go:303] "Creating device plugin manager" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.567788 4790 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.567836 4790 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.568047 4790 state_mem.go:36] "Initialized new in-memory state store" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.568625 4790 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.577409 4790 kubelet.go:418] "Attempting to sync node with API server" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.577443 4790 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.577548 4790 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.577575 4790 kubelet.go:324] "Adding apiserver pod source" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.577596 4790 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.583633 4790 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.584658 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.584655 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 
38.102.83.143:6443: connect: connection refused Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.584785 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.584787 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.585825 4790 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.587999 4790 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589721 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589749 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589759 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589768 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589781 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589790 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589802 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589819 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589829 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589838 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589880 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589890 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.595519 4790 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.596051 4790 server.go:1280] "Started kubelet" Mar 13 20:27:49 crc systemd[1]: Started Kubernetes Kubelet. Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.599433 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.599794 4790 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.599784 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.599800 4790 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.600042 4790 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.600086 4790 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.599952 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.600128 4790 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.599820 4790 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.600318 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="200ms" Mar 
13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.607314 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.607470 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.607967 4790 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.608623 4790 server.go:460] "Adding debug handlers to kubelet server" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.608778 4790 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.608829 4790 factory.go:55] Registering systemd factory Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.608846 4790 factory.go:221] Registration of the systemd container factory successfully Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.609399 4790 factory.go:153] Registering CRI-O factory Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.609433 4790 factory.go:221] Registration of the crio container factory successfully Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.609466 4790 factory.go:103] Registering Raw factory Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 
20:27:49.609487 4790 manager.go:1196] Started watching for new ooms in manager Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.608419 4790 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.143:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189c8086c4fcc930 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.596006704 +0000 UTC m=+0.617122595,LastTimestamp:2026-03-13 20:27:49.596006704 +0000 UTC m=+0.617122595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.610984 4790 manager.go:319] Starting recovery of all containers Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623126 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623232 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623267 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623333 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623352 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623371 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623416 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623435 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623458 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623530 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623553 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623572 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623589 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623685 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623735 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623766 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623790 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623813 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623831 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623852 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623870 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" 
volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623888 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623907 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623925 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623963 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623980 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.624003 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" 
seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.624023 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.624041 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.624058 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.624110 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.624137 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.624155 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 
20:27:49.624172 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.624189 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.624207 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628332 4790 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628435 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628461 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" 
seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628479 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628494 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628509 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628522 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628539 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628554 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 
20:27:49.628571 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628585 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628600 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628616 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628631 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628647 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628661 4790 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628674 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628697 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628711 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628726 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628742 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628757 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628772 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628786 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628800 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628814 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628828 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628846 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628860 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628876 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628889 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628905 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628923 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628939 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628952 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628964 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628976 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628988 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629003 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629015 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629026 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629042 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629056 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629069 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629085 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629098 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629111 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629124 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629139 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629151 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629163 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629201 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629212 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629223 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629234 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629260 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629272 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629284 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" 
seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629297 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629314 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629326 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629340 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629354 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629367 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629400 
4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629416 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629428 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629487 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629501 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629522 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629537 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629551 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629570 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629583 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629597 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629612 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629626 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629641 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629653 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629665 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629678 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629690 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629704 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 13 
20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629716 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629730 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629749 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629762 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629774 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629786 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629797 4790 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629808 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629822 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629834 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629847 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629857 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629869 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629882 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629895 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629913 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629924 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629937 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629949 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629963 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629974 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629986 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630002 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630014 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630027 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630039 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630051 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630064 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630078 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630091 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630103 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" 
seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630115 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630127 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630142 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630155 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630170 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630185 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 
20:27:49.630201 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630214 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630226 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630240 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630250 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630262 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630274 4790 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630283 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630293 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630304 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630315 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630325 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630335 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630359 4790 manager.go:324] Recovery completed Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630399 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630549 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630599 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630609 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630628 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630637 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630648 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630658 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630668 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630679 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630691 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630702 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630713 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630724 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630737 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630752 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630764 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630776 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" 
seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630790 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630802 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630815 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630828 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630840 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630852 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630865 
4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630876 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630887 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630898 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630909 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630920 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630931 4790 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630941 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630953 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630963 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630973 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630983 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630994 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.631004 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.631017 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.631026 4790 reconstruct.go:97] "Volume reconstruction finished" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.631033 4790 reconciler.go:26] "Reconciler: start to sync state" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.638838 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.640741 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.640786 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.640798 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.644347 4790 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.644367 4790 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 13 20:27:49 crc 
kubenswrapper[4790]: I0313 20:27:49.644409 4790 state_mem.go:36] "Initialized new in-memory state store" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.656535 4790 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.658487 4790 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.658572 4790 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.658612 4790 kubelet.go:2335] "Starting kubelet main sync loop" Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.658668 4790 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.659449 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.659538 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.662175 4790 policy_none.go:49] "None policy: Start" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.663092 4790 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.663128 4790 state_mem.go:35] "Initializing new in-memory state store" Mar 13 20:27:49 crc 
kubenswrapper[4790]: E0313 20:27:49.700656 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.708512 4790 manager.go:334] "Starting Device Plugin manager" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.708737 4790 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.708761 4790 server.go:79] "Starting device plugin registration server" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.709188 4790 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.709210 4790 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.709385 4790 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.709564 4790 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.709574 4790 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.715848 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.758790 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.758954 4790 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.761448 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.761498 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.761511 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.761721 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.762719 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.762765 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.762778 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.762823 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.762860 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.762928 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.763202 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.763282 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.763439 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.763463 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.763472 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.764266 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.764295 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.764305 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.764972 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765010 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765020 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765123 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: 
I0313 20:27:49.765242 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765273 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765709 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765734 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765745 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765828 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765874 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765899 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765908 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.766023 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.766085 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.766326 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.766366 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.766401 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.766592 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.766626 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.766970 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.766992 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.767003 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.767126 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.767152 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.767162 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.801755 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="400ms" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.809945 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.811299 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.811333 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.811345 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.811391 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.811844 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.143:6443: connect: connection refused" node="crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.832778 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.832827 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.832853 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.832875 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.832897 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.832942 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: 
I0313 20:27:49.833040 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.833105 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.833133 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.833174 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.833199 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.833240 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.833271 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.833306 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.833328 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935011 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935124 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935173 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935256 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935257 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935302 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935323 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc 
kubenswrapper[4790]: I0313 20:27:49.935346 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935406 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935481 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935493 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935507 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935575 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935590 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935624 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935622 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935680 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935694 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935740 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935753 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935861 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935860 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935906 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935921 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935942 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935960 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.936020 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.936021 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 
20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.936120 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.012475 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.014313 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.014389 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.014403 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.014426 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:27:50 crc kubenswrapper[4790]: E0313 20:27:50.015260 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.143:6443: connect: connection refused" node="crc" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.110030 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.130822 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.139275 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:27:50 crc kubenswrapper[4790]: W0313 20:27:50.157343 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-6c59e8ba7e5c77c197ae0dff4d51e944e20594cf06c492e42268f17ead17a4b4 WatchSource:0}: Error finding container 6c59e8ba7e5c77c197ae0dff4d51e944e20594cf06c492e42268f17ead17a4b4: Status 404 returned error can't find the container with id 6c59e8ba7e5c77c197ae0dff4d51e944e20594cf06c492e42268f17ead17a4b4 Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.158550 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 20:27:50 crc kubenswrapper[4790]: W0313 20:27:50.163558 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-010bd628c531d781ceb414f350a323538a15cd43329e078f60885e7765743afa WatchSource:0}: Error finding container 010bd628c531d781ceb414f350a323538a15cd43329e078f60885e7765743afa: Status 404 returned error can't find the container with id 010bd628c531d781ceb414f350a323538a15cd43329e078f60885e7765743afa Mar 13 20:27:50 crc kubenswrapper[4790]: W0313 20:27:50.164146 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-b99becb8c2df3ea3f462111e3cc78ebe053d9b079152a40d328de29590dec174 WatchSource:0}: Error finding container b99becb8c2df3ea3f462111e3cc78ebe053d9b079152a40d328de29590dec174: Status 404 returned error can't find the container with id b99becb8c2df3ea3f462111e3cc78ebe053d9b079152a40d328de29590dec174 Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.164828 4790 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 20:27:50 crc kubenswrapper[4790]: W0313 20:27:50.172205 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-988c06ac10c8e5bcf9204100e0690808988ea062d7a1a5a82579e09239738152 WatchSource:0}: Error finding container 988c06ac10c8e5bcf9204100e0690808988ea062d7a1a5a82579e09239738152: Status 404 returned error can't find the container with id 988c06ac10c8e5bcf9204100e0690808988ea062d7a1a5a82579e09239738152 Mar 13 20:27:50 crc kubenswrapper[4790]: E0313 20:27:50.202483 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="800ms" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.426437 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.428646 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.428716 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.428733 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.428770 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:27:50 crc kubenswrapper[4790]: E0313 20:27:50.429419 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.143:6443: connect: connection refused" node="crc" Mar 13 20:27:50 crc kubenswrapper[4790]: W0313 20:27:50.598400 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:50 crc kubenswrapper[4790]: E0313 20:27:50.598514 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.601094 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.663870 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7eb1cafcefd5398e40c79482db9ff3626d16ce0f27e093e72f6093252fb76e4e"} Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.664987 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"988c06ac10c8e5bcf9204100e0690808988ea062d7a1a5a82579e09239738152"} Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.666046 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"010bd628c531d781ceb414f350a323538a15cd43329e078f60885e7765743afa"} Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.666954 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b99becb8c2df3ea3f462111e3cc78ebe053d9b079152a40d328de29590dec174"} Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.667899 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6c59e8ba7e5c77c197ae0dff4d51e944e20594cf06c492e42268f17ead17a4b4"} Mar 13 20:27:50 crc kubenswrapper[4790]: W0313 20:27:50.704223 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:50 crc kubenswrapper[4790]: E0313 20:27:50.704328 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 13 20:27:50 crc kubenswrapper[4790]: W0313 20:27:50.897272 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:50 crc kubenswrapper[4790]: E0313 20:27:50.897446 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 13 20:27:50 crc kubenswrapper[4790]: W0313 20:27:50.919904 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:50 crc kubenswrapper[4790]: E0313 20:27:50.920022 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 13 20:27:51 crc kubenswrapper[4790]: E0313 20:27:51.003328 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="1.6s" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.229807 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.232353 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.232416 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.232426 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:51 crc 
kubenswrapper[4790]: I0313 20:27:51.232480 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:27:51 crc kubenswrapper[4790]: E0313 20:27:51.233466 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.143:6443: connect: connection refused" node="crc" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.601670 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.673092 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b" exitCode=0 Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.674076 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.673858 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.673674 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b"} Mar 13 20:27:51 crc kubenswrapper[4790]: E0313 20:27:51.675948 4790 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.143:6443: connect: 
connection refused" logger="UnhandledError" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.677874 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.677947 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.677985 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.679784 4790 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1" exitCode=0 Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.679884 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1"} Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.680018 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.680443 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.681677 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.681708 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.681718 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 
20:27:51.682679 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.682722 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.682740 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.684475 4790 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7" exitCode=0 Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.684522 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7"} Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.684612 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.686374 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.686631 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.686856 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.689734 4790 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5" exitCode=0 Mar 13 20:27:51 
crc kubenswrapper[4790]: I0313 20:27:51.689795 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5"} Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.689901 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.690968 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.691018 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.691037 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.698006 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0"} Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.698099 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5"} Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.698123 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84"} Mar 13 
20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.698144 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe"} Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.698353 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.703337 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.703419 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.703441 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.601959 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:52 crc kubenswrapper[4790]: E0313 20:27:52.604578 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="3.2s" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.699360 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.708203 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8"} Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.708265 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5"} Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.708282 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765"} Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.708291 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b"} Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.712599 4790 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d" exitCode=0 Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.712679 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d"} Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.712869 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.714042 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.714085 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.714098 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.716528 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.716522 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98"} Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.717312 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.717348 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.717360 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.720492 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6"} Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.720525 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312"} Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.720537 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938"} Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.720545 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.720573 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.721597 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.721624 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.721636 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.721791 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.721829 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.721841 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:52 crc kubenswrapper[4790]: E0313 20:27:52.815641 4790 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.143:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189c8086c4fcc930 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.596006704 +0000 UTC m=+0.617122595,LastTimestamp:2026-03-13 20:27:49.596006704 +0000 UTC m=+0.617122595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.833807 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.835410 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.835451 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.835467 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.835534 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:27:52 crc kubenswrapper[4790]: E0313 20:27:52.836258 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.143:6443: connect: connection refused" node="crc" Mar 13 20:27:52 crc kubenswrapper[4790]: W0313 20:27:52.891915 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:52 crc kubenswrapper[4790]: E0313 20:27:52.892021 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 13 20:27:52 crc kubenswrapper[4790]: W0313 20:27:52.907238 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:52 crc kubenswrapper[4790]: E0313 20:27:52.907341 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.363313 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.660583 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.726711 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6a61f22a2153f3d473dcd3aee424a407db7b0fe6864d02f4c01c31829aad7ed0"} Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.726830 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.732679 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.732779 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.732796 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.734829 4790 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364" exitCode=0 Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.734935 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.734977 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.735002 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.735014 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364"} Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.735064 4790 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736126 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736159 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736170 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736312 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736335 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736345 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736529 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736612 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736659 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736677 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736631 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736835 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.213209 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.313500 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.742205 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21"} Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.742277 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3"} Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.742292 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4"} Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.742308 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da"} Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.742298 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.742236 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.742277 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.743117 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.743492 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.743513 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.743522 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.743491 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.743610 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.743625 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.745226 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.745258 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.745268 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.749193 4790 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c549ca4fb0ba0660d98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a"} Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.749307 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.749329 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.749979 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.750371 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.750423 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.750432 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.750653 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.750674 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.750682 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.751244 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.751260 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.751266 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.875644 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.963118 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 20:27:56 crc kubenswrapper[4790]: I0313 20:27:56.037091 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:56 crc kubenswrapper[4790]: I0313 20:27:56.038528 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:56 crc kubenswrapper[4790]: I0313 20:27:56.038582 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:56 crc kubenswrapper[4790]: I0313 20:27:56.038593 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:56 crc kubenswrapper[4790]: I0313 20:27:56.038625 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:27:56 crc kubenswrapper[4790]: I0313 20:27:56.363594 4790 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:27:56 crc kubenswrapper[4790]: I0313 20:27:56.363770 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 20:27:56 crc kubenswrapper[4790]: I0313 20:27:56.756925 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:56 crc kubenswrapper[4790]: I0313 20:27:56.758071 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:56 crc kubenswrapper[4790]: I0313 20:27:56.758130 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:56 crc kubenswrapper[4790]: I0313 20:27:56.758142 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:57 crc kubenswrapper[4790]: I0313 20:27:57.759574 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:57 crc kubenswrapper[4790]: I0313 20:27:57.760472 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:57 crc kubenswrapper[4790]: I0313 20:27:57.760506 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:57 crc kubenswrapper[4790]: I0313 20:27:57.760517 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:58 crc kubenswrapper[4790]: I0313 20:27:58.418960 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:58 crc kubenswrapper[4790]: I0313 20:27:58.419136 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:58 crc kubenswrapper[4790]: I0313 
20:27:58.420542 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:58 crc kubenswrapper[4790]: I0313 20:27:58.420774 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:58 crc kubenswrapper[4790]: I0313 20:27:58.420899 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:59 crc kubenswrapper[4790]: E0313 20:27:59.716090 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 20:28:00 crc kubenswrapper[4790]: I0313 20:28:00.230949 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:28:00 crc kubenswrapper[4790]: I0313 20:28:00.231229 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:00 crc kubenswrapper[4790]: I0313 20:28:00.233113 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:00 crc kubenswrapper[4790]: I0313 20:28:00.233193 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:00 crc kubenswrapper[4790]: I0313 20:28:00.233232 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:00 crc kubenswrapper[4790]: I0313 20:28:00.236442 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:28:00 crc kubenswrapper[4790]: I0313 20:28:00.768926 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:00 crc kubenswrapper[4790]: I0313 20:28:00.770600 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:00 crc kubenswrapper[4790]: I0313 20:28:00.770661 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:00 crc kubenswrapper[4790]: I0313 20:28:00.770686 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:00 crc kubenswrapper[4790]: I0313 20:28:00.774064 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:28:01 crc kubenswrapper[4790]: I0313 20:28:01.771447 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:01 crc kubenswrapper[4790]: I0313 20:28:01.772821 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:01 crc kubenswrapper[4790]: I0313 20:28:01.772872 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:01 crc kubenswrapper[4790]: I0313 20:28:01.772884 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:02 crc kubenswrapper[4790]: I0313 20:28:02.802353 4790 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 13 20:28:02 crc kubenswrapper[4790]: I0313 20:28:02.802497 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get 
\"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 13 20:28:03 crc kubenswrapper[4790]: I0313 20:28:03.601872 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 13 20:28:03 crc kubenswrapper[4790]: W0313 20:28:03.661567 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 13 20:28:03 crc kubenswrapper[4790]: I0313 20:28:03.661686 4790 trace.go:236] Trace[542972085]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Mar-2026 20:27:53.660) (total time: 10001ms): Mar 13 20:28:03 crc kubenswrapper[4790]: Trace[542972085]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (20:28:03.661) Mar 13 20:28:03 crc kubenswrapper[4790]: Trace[542972085]: [10.00115726s] [10.00115726s] END Mar 13 20:28:03 crc kubenswrapper[4790]: E0313 20:28:03.661718 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 13 20:28:03 crc kubenswrapper[4790]: E0313 20:28:03.861706 4790 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:03Z is after 2026-02-23T05:33:13Z" 
event="&Event{ObjectMeta:{crc.189c8086c4fcc930 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.596006704 +0000 UTC m=+0.617122595,LastTimestamp:2026-03-13 20:27:49.596006704 +0000 UTC m=+0.617122595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:03 crc kubenswrapper[4790]: E0313 20:28:03.873570 4790 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 20:28:03 crc kubenswrapper[4790]: E0313 20:28:03.877412 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:03Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 13 20:28:03 crc kubenswrapper[4790]: I0313 20:28:03.878691 4790 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path 
\"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 20:28:03 crc kubenswrapper[4790]: I0313 20:28:03.878750 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 13 20:28:03 crc kubenswrapper[4790]: W0313 20:28:03.878832 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:03Z is after 2026-02-23T05:33:13Z Mar 13 20:28:03 crc kubenswrapper[4790]: E0313 20:28:03.878910 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 20:28:03 crc kubenswrapper[4790]: E0313 20:28:03.879503 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:03Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 20:28:03 crc kubenswrapper[4790]: W0313 20:28:03.883281 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-03-13T20:28:03Z is after 2026-02-23T05:33:13Z Mar 13 20:28:03 crc kubenswrapper[4790]: E0313 20:28:03.883396 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 20:28:03 crc kubenswrapper[4790]: I0313 20:28:03.884329 4790 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 20:28:03 crc kubenswrapper[4790]: I0313 20:28:03.884427 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 13 20:28:03 crc kubenswrapper[4790]: W0313 20:28:03.885267 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:03Z is after 2026-02-23T05:33:13Z Mar 13 20:28:03 crc kubenswrapper[4790]: E0313 20:28:03.885352 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 20:28:04 crc kubenswrapper[4790]: I0313 20:28:04.219027 4790 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]log ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]etcd ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/priority-and-fairness-filter ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/start-apiextensions-informers ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/start-apiextensions-controllers ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/crd-informer-synced ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/start-system-namespaces-controller ok 
Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 13 20:28:04 crc kubenswrapper[4790]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 13 20:28:04 crc kubenswrapper[4790]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/bootstrap-controller ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/start-kube-aggregator-informers ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/apiservice-registration-controller ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/apiservice-discovery-controller ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]autoregister-completion ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/apiservice-openapi-controller ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 13 20:28:04 crc 
kubenswrapper[4790]: livez check failed Mar 13 20:28:04 crc kubenswrapper[4790]: I0313 20:28:04.219120 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 20:28:04 crc kubenswrapper[4790]: I0313 20:28:04.605016 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:04Z is after 2026-02-23T05:33:13Z Mar 13 20:28:04 crc kubenswrapper[4790]: I0313 20:28:04.781564 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 13 20:28:04 crc kubenswrapper[4790]: I0313 20:28:04.783287 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6a61f22a2153f3d473dcd3aee424a407db7b0fe6864d02f4c01c31829aad7ed0" exitCode=255 Mar 13 20:28:04 crc kubenswrapper[4790]: I0313 20:28:04.783338 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6a61f22a2153f3d473dcd3aee424a407db7b0fe6864d02f4c01c31829aad7ed0"} Mar 13 20:28:04 crc kubenswrapper[4790]: I0313 20:28:04.783512 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:04 crc kubenswrapper[4790]: I0313 20:28:04.784425 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:04 crc kubenswrapper[4790]: I0313 20:28:04.784472 4790 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:04 crc kubenswrapper[4790]: I0313 20:28:04.784482 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:04 crc kubenswrapper[4790]: I0313 20:28:04.785086 4790 scope.go:117] "RemoveContainer" containerID="6a61f22a2153f3d473dcd3aee424a407db7b0fe6864d02f4c01c31829aad7ed0" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.403346 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.403730 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.405549 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.405595 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.405609 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.433975 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.605864 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:05Z is after 2026-02-23T05:33:13Z Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.790574 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.792758 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dc4bfb568e7128b6a2356d653026522f42280d96739cf2c56a554b9a9a28fe41"} Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.792908 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.792992 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.793750 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.793778 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.793791 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.794254 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.794282 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.794294 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.810363 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 13 20:28:06 crc 
kubenswrapper[4790]: I0313 20:28:06.364977 4790 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.365087 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.604426 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:06Z is after 2026-02-23T05:33:13Z Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.797688 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.798679 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.801044 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dc4bfb568e7128b6a2356d653026522f42280d96739cf2c56a554b9a9a28fe41" exitCode=255 Mar 13 20:28:06 
crc kubenswrapper[4790]: I0313 20:28:06.801204 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"dc4bfb568e7128b6a2356d653026522f42280d96739cf2c56a554b9a9a28fe41"} Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.801245 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.801317 4790 scope.go:117] "RemoveContainer" containerID="6a61f22a2153f3d473dcd3aee424a407db7b0fe6864d02f4c01c31829aad7ed0" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.801453 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.802084 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.802113 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.802128 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.803110 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.803149 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.803161 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.803750 4790 scope.go:117] "RemoveContainer" 
containerID="dc4bfb568e7128b6a2356d653026522f42280d96739cf2c56a554b9a9a28fe41" Mar 13 20:28:06 crc kubenswrapper[4790]: E0313 20:28:06.803923 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:28:07 crc kubenswrapper[4790]: I0313 20:28:07.605732 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:07Z is after 2026-02-23T05:33:13Z Mar 13 20:28:07 crc kubenswrapper[4790]: W0313 20:28:07.620972 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:07Z is after 2026-02-23T05:33:13Z Mar 13 20:28:07 crc kubenswrapper[4790]: E0313 20:28:07.621259 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 20:28:07 crc kubenswrapper[4790]: I0313 20:28:07.778890 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:28:07 crc kubenswrapper[4790]: I0313 20:28:07.804661 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 13 20:28:07 crc kubenswrapper[4790]: I0313 20:28:07.806679 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:07 crc kubenswrapper[4790]: I0313 20:28:07.807635 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:07 crc kubenswrapper[4790]: I0313 20:28:07.807671 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:07 crc kubenswrapper[4790]: I0313 20:28:07.807681 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:07 crc kubenswrapper[4790]: I0313 20:28:07.808191 4790 scope.go:117] "RemoveContainer" containerID="dc4bfb568e7128b6a2356d653026522f42280d96739cf2c56a554b9a9a28fe41" Mar 13 20:28:07 crc kubenswrapper[4790]: E0313 20:28:07.808427 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:28:08 crc kubenswrapper[4790]: W0313 20:28:08.152848 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-13T20:28:08Z is after 2026-02-23T05:33:13Z Mar 13 20:28:08 crc kubenswrapper[4790]: E0313 20:28:08.152943 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:08Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 20:28:08 crc kubenswrapper[4790]: I0313 20:28:08.603934 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:08Z is after 2026-02-23T05:33:13Z Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.225650 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.225857 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.227441 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.227539 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.227566 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.228808 4790 scope.go:117] "RemoveContainer" containerID="dc4bfb568e7128b6a2356d653026522f42280d96739cf2c56a554b9a9a28fe41" Mar 
13 20:28:09 crc kubenswrapper[4790]: E0313 20:28:09.229093 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.241910 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.603037 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:09Z is after 2026-02-23T05:33:13Z Mar 13 20:28:09 crc kubenswrapper[4790]: E0313 20:28:09.716187 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.812280 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.813477 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.813608 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.813754 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.814673 4790 
scope.go:117] "RemoveContainer" containerID="dc4bfb568e7128b6a2356d653026522f42280d96739cf2c56a554b9a9a28fe41" Mar 13 20:28:09 crc kubenswrapper[4790]: E0313 20:28:09.815008 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:28:10 crc kubenswrapper[4790]: I0313 20:28:10.280093 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:10 crc kubenswrapper[4790]: E0313 20:28:10.281434 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:10Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 13 20:28:10 crc kubenswrapper[4790]: I0313 20:28:10.281687 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:10 crc kubenswrapper[4790]: I0313 20:28:10.281819 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:10 crc kubenswrapper[4790]: I0313 20:28:10.281881 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:10 crc kubenswrapper[4790]: I0313 20:28:10.281976 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:28:10 crc kubenswrapper[4790]: E0313 20:28:10.284810 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:10Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 20:28:10 crc kubenswrapper[4790]: W0313 20:28:10.521534 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 13 20:28:10 crc kubenswrapper[4790]: E0313 20:28:10.521609 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 13 20:28:10 crc kubenswrapper[4790]: I0313 20:28:10.605110 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:11 crc kubenswrapper[4790]: I0313 20:28:11.609076 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:12 crc kubenswrapper[4790]: I0313 20:28:12.605137 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:12 crc kubenswrapper[4790]: I0313 20:28:12.646176 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 20:28:12 crc kubenswrapper[4790]: I0313 20:28:12.664932 4790 reflector.go:368] Caches 
populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 13 20:28:12 crc kubenswrapper[4790]: I0313 20:28:12.801163 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:28:12 crc kubenswrapper[4790]: I0313 20:28:12.801518 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:12 crc kubenswrapper[4790]: I0313 20:28:12.803537 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:12 crc kubenswrapper[4790]: I0313 20:28:12.803611 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:12 crc kubenswrapper[4790]: I0313 20:28:12.803637 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:12 crc kubenswrapper[4790]: I0313 20:28:12.804633 4790 scope.go:117] "RemoveContainer" containerID="dc4bfb568e7128b6a2356d653026522f42280d96739cf2c56a554b9a9a28fe41" Mar 13 20:28:12 crc kubenswrapper[4790]: E0313 20:28:12.805000 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:28:13 crc kubenswrapper[4790]: I0313 20:28:13.611061 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.870583 4790 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c4fcc930 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.596006704 +0000 UTC m=+0.617122595,LastTimestamp:2026-03-13 20:27:49.596006704 +0000 UTC m=+0.617122595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.877614 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a7d19e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.64076995 +0000 UTC m=+0.661885841,LastTimestamp:2026-03-13 20:27:49.64076995 +0000 UTC m=+0.661885841,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.884610 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a82a91 default 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640792721 +0000 UTC m=+0.661908612,LastTimestamp:2026-03-13 20:27:49.640792721 +0000 UTC m=+0.661908612,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.891833 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a8560b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640803851 +0000 UTC m=+0.661919742,LastTimestamp:2026-03-13 20:27:49.640803851 +0000 UTC m=+0.661919742,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.899058 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086cbde6881 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit 
across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.711456385 +0000 UTC m=+0.732572276,LastTimestamp:2026-03-13 20:27:49.711456385 +0000 UTC m=+0.732572276,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.907097 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a7d19e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a7d19e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.64076995 +0000 UTC m=+0.661885841,LastTimestamp:2026-03-13 20:27:49.76148253 +0000 UTC m=+0.782598421,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.914206 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a82a91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a82a91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640792721 +0000 UTC m=+0.661908612,LastTimestamp:2026-03-13 
20:27:49.761506491 +0000 UTC m=+0.782622382,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.921182 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a8560b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a8560b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640803851 +0000 UTC m=+0.661919742,LastTimestamp:2026-03-13 20:27:49.761518122 +0000 UTC m=+0.782634013,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.926189 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a7d19e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a7d19e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.64076995 +0000 UTC m=+0.661885841,LastTimestamp:2026-03-13 20:27:49.762744317 +0000 UTC m=+0.783860208,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.933691 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a82a91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a82a91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640792721 +0000 UTC m=+0.661908612,LastTimestamp:2026-03-13 20:27:49.762773659 +0000 UTC m=+0.783889550,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.939156 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a8560b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a8560b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640803851 +0000 UTC m=+0.661919742,LastTimestamp:2026-03-13 20:27:49.762783159 +0000 UTC m=+0.783899050,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.946690 4790 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189c8086c7a7d19e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a7d19e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.64076995 +0000 UTC m=+0.661885841,LastTimestamp:2026-03-13 20:27:49.763454249 +0000 UTC m=+0.784570130,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.953909 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a82a91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a82a91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640792721 +0000 UTC m=+0.661908612,LastTimestamp:2026-03-13 20:27:49.76346871 +0000 UTC m=+0.784584601,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.960199 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a8560b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group 
\"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a8560b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640803851 +0000 UTC m=+0.661919742,LastTimestamp:2026-03-13 20:27:49.76347707 +0000 UTC m=+0.784592961,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.967049 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a7d19e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a7d19e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.64076995 +0000 UTC m=+0.661885841,LastTimestamp:2026-03-13 20:27:49.764287418 +0000 UTC m=+0.785403309,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.972528 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a82a91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a82a91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640792721 +0000 UTC m=+0.661908612,LastTimestamp:2026-03-13 20:27:49.764301768 +0000 UTC m=+0.785417659,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.976649 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a8560b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a8560b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640803851 +0000 UTC m=+0.661919742,LastTimestamp:2026-03-13 20:27:49.764311739 +0000 UTC m=+0.785427630,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.981230 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a7d19e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a7d19e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.64076995 +0000 UTC m=+0.661885841,LastTimestamp:2026-03-13 20:27:49.76499689 +0000 UTC m=+0.786112781,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.986497 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a82a91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a82a91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640792721 +0000 UTC m=+0.661908612,LastTimestamp:2026-03-13 20:27:49.765017041 +0000 UTC m=+0.786132932,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.991447 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a8560b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a8560b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640803851 +0000 UTC 
m=+0.661919742,LastTimestamp:2026-03-13 20:27:49.765027531 +0000 UTC m=+0.786143432,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.998782 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a7d19e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a7d19e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.64076995 +0000 UTC m=+0.661885841,LastTimestamp:2026-03-13 20:27:49.765724283 +0000 UTC m=+0.786840174,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.005728 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a82a91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a82a91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640792721 +0000 UTC m=+0.661908612,LastTimestamp:2026-03-13 20:27:49.765740313 +0000 UTC m=+0.786856204,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.010588 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a8560b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a8560b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640803851 +0000 UTC m=+0.661919742,LastTimestamp:2026-03-13 20:27:49.765751034 +0000 UTC m=+0.786866925,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.014674 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a7d19e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a7d19e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.64076995 +0000 UTC m=+0.661885841,LastTimestamp:2026-03-13 20:27:49.76588529 +0000 UTC m=+0.787001181,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.019053 4790 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a82a91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a82a91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640792721 +0000 UTC m=+0.661908612,LastTimestamp:2026-03-13 20:27:49.765904731 +0000 UTC m=+0.787020622,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.028025 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8086e6e80380 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.16507072 +0000 UTC m=+1.186186611,LastTimestamp:2026-03-13 20:27:50.16507072 +0000 UTC m=+1.186186611,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.033884 
4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c8086e7061516 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.167041302 +0000 UTC m=+1.188157193,LastTimestamp:2026-03-13 20:27:50.167041302 +0000 UTC m=+1.188157193,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.040983 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c8086e70de4e8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.167553256 +0000 UTC m=+1.188669177,LastTimestamp:2026-03-13 20:27:50.167553256 +0000 UTC m=+1.188669177,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.045729 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c8086e7c760b0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.179709104 +0000 UTC m=+1.200824995,LastTimestamp:2026-03-13 20:27:50.179709104 +0000 UTC m=+1.200824995,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.051533 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c8086e82a08b7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.186174647 +0000 UTC m=+1.207290538,LastTimestamp:2026-03-13 20:27:50.186174647 +0000 UTC m=+1.207290538,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.057532 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c80870b739e4f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.778199631 +0000 UTC m=+1.799315552,LastTimestamp:2026-03-13 20:27:50.778199631 +0000 UTC m=+1.799315552,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.064667 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" 
in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c80870b85f55c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.779401564 +0000 UTC m=+1.800517455,LastTimestamp:2026-03-13 20:27:50.779401564 +0000 UTC m=+1.800517455,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.071570 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c80870b91ae38 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.780169784 +0000 UTC m=+1.801285715,LastTimestamp:2026-03-13 20:27:50.780169784 +0000 UTC m=+1.801285715,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.077792 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c80870ba3c3f5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.781354997 +0000 UTC m=+1.802470888,LastTimestamp:2026-03-13 20:27:50.781354997 +0000 UTC m=+1.802470888,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.081938 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c80870be1b5ce openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.785414606 +0000 UTC m=+1.806530507,LastTimestamp:2026-03-13 20:27:50.785414606 +0000 UTC m=+1.806530507,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.087862 4790 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c80870c6cbc73 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.794525811 +0000 UTC m=+1.815641702,LastTimestamp:2026-03-13 20:27:50.794525811 +0000 UTC m=+1.815641702,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.091782 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c80870c83465f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.796002911 +0000 UTC m=+1.817118802,LastTimestamp:2026-03-13 20:27:50.796002911 +0000 UTC m=+1.817118802,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.095432 4790 event.go:359] "Server rejected event (will 
not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c80870c836671 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.796011121 +0000 UTC m=+1.817127012,LastTimestamp:2026-03-13 20:27:50.796011121 +0000 UTC m=+1.817127012,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.099022 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c80870c8d5459 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.796661849 +0000 UTC m=+1.817777730,LastTimestamp:2026-03-13 20:27:50.796661849 +0000 UTC 
m=+1.817777730,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.102218 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c80870cc19171 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.800085361 +0000 UTC m=+1.821201252,LastTimestamp:2026-03-13 20:27:50.800085361 +0000 UTC m=+1.821201252,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.105388 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c80870cf404c0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.80339168 +0000 UTC 
m=+1.824507571,LastTimestamp:2026-03-13 20:27:50.80339168 +0000 UTC m=+1.824507571,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.108874 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c80872150a61c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.14500662 +0000 UTC m=+2.166122511,LastTimestamp:2026-03-13 20:27:51.14500662 +0000 UTC m=+2.166122511,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.112700 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c8087220437d3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.156774867 +0000 UTC m=+2.177890758,LastTimestamp:2026-03-13 20:27:51.156774867 +0000 UTC m=+2.177890758,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.116277 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c8087221a6cdc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.158230236 +0000 UTC m=+2.179346127,LastTimestamp:2026-03-13 20:27:51.158230236 +0000 UTC m=+2.179346127,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.119810 4790 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c80872e61cead openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.364234925 +0000 UTC m=+2.385350816,LastTimestamp:2026-03-13 20:27:51.364234925 +0000 UTC m=+2.385350816,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.123212 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c80872f1690f7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.376081143 +0000 UTC m=+2.397197034,LastTimestamp:2026-03-13 20:27:51.376081143 +0000 UTC m=+2.397197034,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.126947 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c80872f26d3c6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.377146822 +0000 UTC m=+2.398262713,LastTimestamp:2026-03-13 20:27:51.377146822 +0000 UTC m=+2.398262713,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.131099 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c808738fe00d2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.542243538 +0000 UTC m=+2.563359429,LastTimestamp:2026-03-13 20:27:51.542243538 +0000 UTC m=+2.563359429,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.134558 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c808739a645a5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.553271205 +0000 UTC m=+2.574387096,LastTimestamp:2026-03-13 20:27:51.553271205 +0000 UTC m=+2.574387096,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.140933 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c808741344446 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.680017478 +0000 UTC m=+2.701133369,LastTimestamp:2026-03-13 20:27:51.680017478 +0000 UTC m=+2.701133369,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.145625 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c80874166509d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.683297437 +0000 UTC m=+2.704413378,LastTimestamp:2026-03-13 20:27:51.683297437 +0000 UTC m=+2.704413378,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.149522 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c808741d1b8dd openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.690336477 +0000 UTC m=+2.711452428,LastTimestamp:2026-03-13 20:27:51.690336477 +0000 UTC m=+2.711452428,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.154085 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c8087424d48ed openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.698434285 +0000 UTC m=+2.719550206,LastTimestamp:2026-03-13 20:27:51.698434285 +0000 UTC m=+2.719550206,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.159702 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087509c98da openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.938513114 +0000 UTC m=+2.959629005,LastTimestamp:2026-03-13 20:27:51.938513114 +0000 UTC m=+2.959629005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.165062 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c8087509d5055 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.938560085 +0000 UTC m=+2.959675976,LastTimestamp:2026-03-13 20:27:51.938560085 +0000 UTC m=+2.959675976,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.171916 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c808750ad742b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.939617835 +0000 UTC m=+2.960733726,LastTimestamp:2026-03-13 20:27:51.939617835 +0000 UTC m=+2.960733726,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.178855 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c808750cf12f9 
openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.941821177 +0000 UTC m=+2.962937068,LastTimestamp:2026-03-13 20:27:51.941821177 +0000 UTC m=+2.962937068,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.183519 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c8087517e45c4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.95330298 +0000 UTC m=+2.974418861,LastTimestamp:2026-03-13 20:27:51.95330298 +0000 UTC m=+2.974418861,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.188271 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c8087519453b4 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.95474834 +0000 UTC m=+2.975864231,LastTimestamp:2026-03-13 20:27:51.95474834 +0000 UTC m=+2.975864231,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.194527 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c80875195e73b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.954851643 +0000 UTC m=+2.975967534,LastTimestamp:2026-03-13 20:27:51.954851643 +0000 UTC m=+2.975967534,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.199244 4790 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c8087519852d2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.955010258 +0000 UTC m=+2.976126149,LastTimestamp:2026-03-13 20:27:51.955010258 +0000 UTC m=+2.976126149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.204093 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c808751ac1192 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.956304274 +0000 UTC 
m=+2.977420165,LastTimestamp:2026-03-13 20:27:51.956304274 +0000 UTC m=+2.977420165,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.211431 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c808751db23fd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.959389181 +0000 UTC m=+2.980505072,LastTimestamp:2026-03-13 20:27:51.959389181 +0000 UTC m=+2.980505072,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.216210 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c80875f9f0dc7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container 
kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.190332359 +0000 UTC m=+3.211448250,LastTimestamp:2026-03-13 20:27:52.190332359 +0000 UTC m=+3.211448250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.221038 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c80875fc6bb40 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.192932672 +0000 UTC m=+3.214048563,LastTimestamp:2026-03-13 20:27:52.192932672 +0000 UTC m=+3.214048563,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.225476 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c80876095a90b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.206493963 +0000 UTC m=+3.227609854,LastTimestamp:2026-03-13 20:27:52.206493963 +0000 UTC m=+3.227609854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.230095 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c808760acd19d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.208011677 +0000 UTC m=+3.229127568,LastTimestamp:2026-03-13 20:27:52.208011677 +0000 UTC m=+3.229127568,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.234195 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c808760cba7c8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.210032584 +0000 UTC m=+3.231148475,LastTimestamp:2026-03-13 20:27:52.210032584 +0000 UTC m=+3.231148475,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.239702 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c808760e3481d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.211580957 +0000 UTC m=+3.232696848,LastTimestamp:2026-03-13 20:27:52.211580957 +0000 UTC m=+3.232696848,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.244848 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c80876b87d9a9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.390138281 +0000 UTC m=+3.411254172,LastTimestamp:2026-03-13 20:27:52.390138281 +0000 UTC m=+3.411254172,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.249067 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c80876ba03eb7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.391737015 
+0000 UTC m=+3.412852926,LastTimestamp:2026-03-13 20:27:52.391737015 +0000 UTC m=+3.412852926,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.253436 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c80876c76e567 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.405804391 +0000 UTC m=+3.426920282,LastTimestamp:2026-03-13 20:27:52.405804391 +0000 UTC m=+3.426920282,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.257731 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c80876c90a7c5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.407492549 +0000 UTC m=+3.428608440,LastTimestamp:2026-03-13 20:27:52.407492549 +0000 UTC m=+3.428608440,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.263874 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c80876ca6ab0c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.40893518 +0000 UTC m=+3.430051071,LastTimestamp:2026-03-13 20:27:52.40893518 +0000 UTC m=+3.430051071,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.271642 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c808776ce152c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.579290412 +0000 UTC m=+3.600406303,LastTimestamp:2026-03-13 20:27:52.579290412 +0000 UTC m=+3.600406303,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.278046 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c8087779dd855 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.592906325 +0000 UTC m=+3.614022216,LastTimestamp:2026-03-13 20:27:52.592906325 +0000 UTC m=+3.614022216,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.284952 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189c808777afd2bb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.594084539 +0000 UTC m=+3.615200420,LastTimestamp:2026-03-13 20:27:52.594084539 +0000 UTC m=+3.615200420,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.291668 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c80877eeafb6c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.715402092 +0000 UTC m=+3.736517983,LastTimestamp:2026-03-13 20:27:52.715402092 +0000 UTC m=+3.736517983,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.299055 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c808783683d07 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.790719751 +0000 UTC m=+3.811835642,LastTimestamp:2026-03-13 20:27:52.790719751 +0000 UTC m=+3.811835642,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.302962 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c8087843b66ff openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.804558591 +0000 UTC m=+3.825674482,LastTimestamp:2026-03-13 
20:27:52.804558591 +0000 UTC m=+3.825674482,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.307910 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c80878b2e9bd5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.921160661 +0000 UTC m=+3.942276552,LastTimestamp:2026-03-13 20:27:52.921160661 +0000 UTC m=+3.942276552,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.311789 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c80878c022ec3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.935026371 +0000 UTC m=+3.956142252,LastTimestamp:2026-03-13 20:27:52.935026371 +0000 UTC 
m=+3.956142252,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.316238 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087bbd7f58e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:53.737565582 +0000 UTC m=+4.758681473,LastTimestamp:2026-03-13 20:27:53.737565582 +0000 UTC m=+4.758681473,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.319529 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087cdc9a530 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.038617392 +0000 UTC 
m=+5.059733283,LastTimestamp:2026-03-13 20:27:54.038617392 +0000 UTC m=+5.059733283,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.323432 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087ce56a156 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.047856982 +0000 UTC m=+5.068972883,LastTimestamp:2026-03-13 20:27:54.047856982 +0000 UTC m=+5.068972883,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.329014 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087ce70a248 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.04956116 +0000 UTC m=+5.070677081,LastTimestamp:2026-03-13 20:27:54.04956116 +0000 UTC m=+5.070677081,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.332819 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087d7f9ab7b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.209536891 +0000 UTC m=+5.230652782,LastTimestamp:2026-03-13 20:27:54.209536891 +0000 UTC m=+5.230652782,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.336814 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087d895404c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.219733068 +0000 UTC 
m=+5.240848959,LastTimestamp:2026-03-13 20:27:54.219733068 +0000 UTC m=+5.240848959,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.340831 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087d8a8ceb9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.221014713 +0000 UTC m=+5.242130604,LastTimestamp:2026-03-13 20:27:54.221014713 +0000 UTC m=+5.242130604,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.345025 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087e273295f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container 
etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.385271135 +0000 UTC m=+5.406387026,LastTimestamp:2026-03-13 20:27:54.385271135 +0000 UTC m=+5.406387026,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.349737 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087e37166ea openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.401933034 +0000 UTC m=+5.423048935,LastTimestamp:2026-03-13 20:27:54.401933034 +0000 UTC m=+5.423048935,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.354818 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087e387fdd0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.403413456 +0000 UTC m=+5.424529347,LastTimestamp:2026-03-13 20:27:54.403413456 +0000 UTC m=+5.424529347,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.359074 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087ed5e8061 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.568466529 +0000 UTC m=+5.589582410,LastTimestamp:2026-03-13 20:27:54.568466529 +0000 UTC m=+5.589582410,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.362870 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087ede07770 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.57698392 +0000 UTC m=+5.598099811,LastTimestamp:2026-03-13 20:27:54.57698392 +0000 UTC m=+5.598099811,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.367272 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087edff68a2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.579011746 +0000 UTC m=+5.600127637,LastTimestamp:2026-03-13 20:27:54.579011746 +0000 UTC m=+5.600127637,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.372214 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087f9db8a97 openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.777987735 +0000 UTC m=+5.799103626,LastTimestamp:2026-03-13 20:27:54.777987735 +0000 UTC m=+5.799103626,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.376697 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087fa8c9aa9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.789591721 +0000 UTC m=+5.810707622,LastTimestamp:2026-03-13 20:27:54.789591721 +0000 UTC m=+5.810707622,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.381737 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 20:28:14 crc kubenswrapper[4790]: &Event{ObjectMeta:{kube-controller-manager-crc.189c8088585fde92 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 13 20:28:14 crc kubenswrapper[4790]: body: Mar 13 20:28:14 crc kubenswrapper[4790]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:56.36371829 +0000 UTC m=+7.384834271,LastTimestamp:2026-03-13 20:27:56.36371829 +0000 UTC m=+7.384834271,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 20:28:14 crc kubenswrapper[4790]: > Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.387887 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c808858623e0c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:56.363873804 +0000 UTC m=+7.384989735,LastTimestamp:2026-03-13 20:27:56.363873804 +0000 UTC 
m=+7.384989735,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.395074 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 13 20:28:14 crc kubenswrapper[4790]: &Event{ObjectMeta:{kube-apiserver-crc.189c8089d8276554 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 13 20:28:14 crc kubenswrapper[4790]: body: Mar 13 20:28:14 crc kubenswrapper[4790]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:02.80246818 +0000 UTC m=+13.823584161,LastTimestamp:2026-03-13 20:28:02.80246818 +0000 UTC m=+13.823584161,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 20:28:14 crc kubenswrapper[4790]: > Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.400454 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c8089d828b8e8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:02.802555112 +0000 UTC m=+13.823671043,LastTimestamp:2026-03-13 20:28:02.802555112 +0000 UTC m=+13.823671043,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.405137 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 13 20:28:14 crc kubenswrapper[4790]: &Event{ObjectMeta:{kube-apiserver-crc.189c808a184ddd5d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 13 20:28:14 crc kubenswrapper[4790]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 20:28:14 crc kubenswrapper[4790]: Mar 13 20:28:14 crc kubenswrapper[4790]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:03.878731101 +0000 UTC m=+14.899846992,LastTimestamp:2026-03-13 20:28:03.878731101 +0000 UTC 
m=+14.899846992,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 20:28:14 crc kubenswrapper[4790]: > Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.410007 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c808a184e923b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:03.878777403 +0000 UTC m=+14.899893294,LastTimestamp:2026-03-13 20:28:03.878777403 +0000 UTC m=+14.899893294,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.415641 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c808a184ddd5d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 13 20:28:14 crc kubenswrapper[4790]: &Event{ObjectMeta:{kube-apiserver-crc.189c808a184ddd5d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe 
error: HTTP probe failed with statuscode: 403 Mar 13 20:28:14 crc kubenswrapper[4790]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 20:28:14 crc kubenswrapper[4790]: Mar 13 20:28:14 crc kubenswrapper[4790]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:03.878731101 +0000 UTC m=+14.899846992,LastTimestamp:2026-03-13 20:28:03.884397961 +0000 UTC m=+14.905513852,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 20:28:14 crc kubenswrapper[4790]: > Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.420421 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c808a184e923b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c808a184e923b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:03.878777403 +0000 UTC m=+14.899893294,LastTimestamp:2026-03-13 20:28:03.884458882 +0000 UTC m=+14.905574773,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.425179 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 13 20:28:14 crc kubenswrapper[4790]: &Event{ObjectMeta:{kube-apiserver-crc.189c808a2c976de3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 500 Mar 13 20:28:14 crc kubenswrapper[4790]: body: [+]ping ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]log ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]etcd ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/priority-and-fairness-filter ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/start-apiextensions-informers ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/start-apiextensions-controllers ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/crd-informer-synced ok 
Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/start-system-namespaces-controller ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 13 20:28:14 crc kubenswrapper[4790]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 13 20:28:14 crc kubenswrapper[4790]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/bootstrap-controller ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/start-kube-aggregator-informers ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/apiservice-registration-controller ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/apiservice-discovery-controller ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]autoregister-completion ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/apiservice-openapi-controller ok Mar 13 20:28:14 crc 
kubenswrapper[4790]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 13 20:28:14 crc kubenswrapper[4790]: livez check failed Mar 13 20:28:14 crc kubenswrapper[4790]: Mar 13 20:28:14 crc kubenswrapper[4790]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:04.219096547 +0000 UTC m=+15.240212468,LastTimestamp:2026-03-13 20:28:04.219096547 +0000 UTC m=+15.240212468,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 20:28:14 crc kubenswrapper[4790]: > Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.431688 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 20:28:14 crc kubenswrapper[4790]: &Event{ObjectMeta:{kube-controller-manager-crc.189c808aac800c25 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 20:28:14 crc kubenswrapper[4790]: body: Mar 13 20:28:14 crc kubenswrapper[4790]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:06.365047845 +0000 UTC m=+17.386163776,LastTimestamp:2026-03-13 20:28:06.365047845 +0000 UTC m=+17.386163776,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 20:28:14 crc kubenswrapper[4790]: > Mar 13 20:28:14 crc 
kubenswrapper[4790]: E0313 20:28:14.436847 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c808aac8164d8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:06.365136088 +0000 UTC m=+17.386252009,LastTimestamp:2026-03-13 20:28:06.365136088 +0000 UTC m=+17.386252009,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: I0313 20:28:14.605037 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:15 crc kubenswrapper[4790]: W0313 20:28:15.475962 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 13 20:28:15 crc kubenswrapper[4790]: E0313 20:28:15.476024 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is 
forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 13 20:28:15 crc kubenswrapper[4790]: I0313 20:28:15.605784 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.364002 4790 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.364140 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.364215 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.364486 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.366081 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.366132 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.366145 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.366789 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.366982 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84" gracePeriod=30 Mar 13 20:28:16 crc kubenswrapper[4790]: E0313 20:28:16.369501 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c808aac800c25\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 20:28:16 crc kubenswrapper[4790]: &Event{ObjectMeta:{kube-controller-manager-crc.189c808aac800c25 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers) Mar 13 20:28:16 crc kubenswrapper[4790]: body: Mar 13 20:28:16 crc kubenswrapper[4790]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:06.365047845 +0000 UTC m=+17.386163776,LastTimestamp:2026-03-13 20:28:16.364110168 +0000 UTC m=+27.385226099,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 20:28:16 crc kubenswrapper[4790]: > Mar 13 20:28:16 crc kubenswrapper[4790]: E0313 20:28:16.375670 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c808aac8164d8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c808aac8164d8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:06.365136088 +0000 UTC m=+17.386252009,LastTimestamp:2026-03-13 20:28:16.36417783 +0000 UTC m=+27.385293761,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:16 crc kubenswrapper[4790]: E0313 20:28:16.382291 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c808d00a91ed2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:16.366960338 +0000 UTC m=+27.388076229,LastTimestamp:2026-03-13 20:28:16.366960338 +0000 UTC m=+27.388076229,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:16 crc kubenswrapper[4790]: E0313 20:28:16.496767 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c80870c8d5459\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c80870c8d5459 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.796661849 +0000 UTC m=+1.817777730,LastTimestamp:2026-03-13 20:28:16.490287908 +0000 UTC 
m=+27.511403799,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.607701 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:16 crc kubenswrapper[4790]: E0313 20:28:16.713271 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c80872150a61c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c80872150a61c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.14500662 +0000 UTC m=+2.166122511,LastTimestamp:2026-03-13 20:28:16.706123061 +0000 UTC m=+27.727238952,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:16 crc kubenswrapper[4790]: E0313 20:28:16.728107 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c8087220437d3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c8087220437d3 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.156774867 +0000 UTC m=+2.177890758,LastTimestamp:2026-03-13 20:28:16.720622558 +0000 UTC m=+27.741738449,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:16 crc kubenswrapper[4790]: W0313 20:28:16.787799 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 13 20:28:16 crc kubenswrapper[4790]: E0313 20:28:16.787915 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.837077 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.837616 4790 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84" exitCode=255 Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 
20:28:16.837683 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84"} Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.838014 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34"} Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.838175 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.839156 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.839211 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.839227 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:17 crc kubenswrapper[4790]: I0313 20:28:17.285422 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:17 crc kubenswrapper[4790]: I0313 20:28:17.287743 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:17 crc kubenswrapper[4790]: I0313 20:28:17.287803 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:17 crc kubenswrapper[4790]: I0313 20:28:17.287820 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:17 crc 
kubenswrapper[4790]: I0313 20:28:17.287853 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:28:17 crc kubenswrapper[4790]: E0313 20:28:17.289678 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 20:28:17 crc kubenswrapper[4790]: E0313 20:28:17.289818 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 20:28:17 crc kubenswrapper[4790]: I0313 20:28:17.606017 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:17 crc kubenswrapper[4790]: I0313 20:28:17.840908 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:17 crc kubenswrapper[4790]: I0313 20:28:17.842257 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:17 crc kubenswrapper[4790]: I0313 20:28:17.842321 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:17 crc kubenswrapper[4790]: I0313 20:28:17.842335 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:18 crc kubenswrapper[4790]: I0313 20:28:18.608249 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster 
scope Mar 13 20:28:19 crc kubenswrapper[4790]: I0313 20:28:19.607494 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:19 crc kubenswrapper[4790]: E0313 20:28:19.716532 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 20:28:20 crc kubenswrapper[4790]: W0313 20:28:20.221221 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:20 crc kubenswrapper[4790]: E0313 20:28:20.221595 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 13 20:28:20 crc kubenswrapper[4790]: I0313 20:28:20.606511 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:21 crc kubenswrapper[4790]: I0313 20:28:21.607044 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:22 crc kubenswrapper[4790]: I0313 20:28:22.607867 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is 
forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:23 crc kubenswrapper[4790]: I0313 20:28:23.363919 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:28:23 crc kubenswrapper[4790]: I0313 20:28:23.364225 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:23 crc kubenswrapper[4790]: I0313 20:28:23.366091 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:23 crc kubenswrapper[4790]: I0313 20:28:23.366178 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:23 crc kubenswrapper[4790]: I0313 20:28:23.366208 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:23 crc kubenswrapper[4790]: I0313 20:28:23.606918 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:24 crc kubenswrapper[4790]: I0313 20:28:24.290253 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:24 crc kubenswrapper[4790]: I0313 20:28:24.291990 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:24 crc kubenswrapper[4790]: I0313 20:28:24.292071 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:24 crc kubenswrapper[4790]: I0313 20:28:24.292099 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 
20:28:24 crc kubenswrapper[4790]: I0313 20:28:24.292192 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:28:24 crc kubenswrapper[4790]: E0313 20:28:24.296319 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 20:28:24 crc kubenswrapper[4790]: E0313 20:28:24.296935 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 20:28:24 crc kubenswrapper[4790]: I0313 20:28:24.313955 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:28:24 crc kubenswrapper[4790]: I0313 20:28:24.314229 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:24 crc kubenswrapper[4790]: I0313 20:28:24.315645 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:24 crc kubenswrapper[4790]: I0313 20:28:24.315708 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:24 crc kubenswrapper[4790]: I0313 20:28:24.315726 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:24 crc kubenswrapper[4790]: I0313 20:28:24.606539 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:25 crc kubenswrapper[4790]: I0313 20:28:25.605942 4790 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:26 crc kubenswrapper[4790]: I0313 20:28:26.365005 4790 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:28:26 crc kubenswrapper[4790]: I0313 20:28:26.365350 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:28:26 crc kubenswrapper[4790]: E0313 20:28:26.372442 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c808aac800c25\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 20:28:26 crc kubenswrapper[4790]: &Event{ObjectMeta:{kube-controller-manager-crc.189c808aac800c25 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled 
while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 20:28:26 crc kubenswrapper[4790]: body: Mar 13 20:28:26 crc kubenswrapper[4790]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:06.365047845 +0000 UTC m=+17.386163776,LastTimestamp:2026-03-13 20:28:26.36532605 +0000 UTC m=+37.386441981,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 20:28:26 crc kubenswrapper[4790]: > Mar 13 20:28:26 crc kubenswrapper[4790]: E0313 20:28:26.377063 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c808aac8164d8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c808aac8164d8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:06.365136088 +0000 UTC m=+17.386252009,LastTimestamp:2026-03-13 20:28:26.365554777 +0000 UTC m=+37.386670708,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:26 crc kubenswrapper[4790]: I0313 20:28:26.606045 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is 
forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:26 crc kubenswrapper[4790]: I0313 20:28:26.659852 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:26 crc kubenswrapper[4790]: I0313 20:28:26.661056 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:26 crc kubenswrapper[4790]: I0313 20:28:26.661095 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:26 crc kubenswrapper[4790]: I0313 20:28:26.661110 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:26 crc kubenswrapper[4790]: I0313 20:28:26.661683 4790 scope.go:117] "RemoveContainer" containerID="dc4bfb568e7128b6a2356d653026522f42280d96739cf2c56a554b9a9a28fe41" Mar 13 20:28:26 crc kubenswrapper[4790]: I0313 20:28:26.869016 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 13 20:28:27 crc kubenswrapper[4790]: I0313 20:28:27.605651 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:27 crc kubenswrapper[4790]: I0313 20:28:27.884622 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 20:28:27 crc kubenswrapper[4790]: I0313 20:28:27.886027 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 13 20:28:27 crc kubenswrapper[4790]: I0313 20:28:27.888769 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1d18cf69538bfc7de3613ae38b728c9f3d0e38ca99b39fb09f625bd27c4e542e" exitCode=255 Mar 13 20:28:27 crc kubenswrapper[4790]: I0313 20:28:27.888830 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1d18cf69538bfc7de3613ae38b728c9f3d0e38ca99b39fb09f625bd27c4e542e"} Mar 13 20:28:27 crc kubenswrapper[4790]: I0313 20:28:27.889053 4790 scope.go:117] "RemoveContainer" containerID="dc4bfb568e7128b6a2356d653026522f42280d96739cf2c56a554b9a9a28fe41" Mar 13 20:28:27 crc kubenswrapper[4790]: I0313 20:28:27.889238 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:27 crc kubenswrapper[4790]: I0313 20:28:27.891483 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:27 crc kubenswrapper[4790]: I0313 20:28:27.891627 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:27 crc kubenswrapper[4790]: I0313 20:28:27.891718 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:27 crc kubenswrapper[4790]: I0313 20:28:27.892611 4790 scope.go:117] "RemoveContainer" containerID="1d18cf69538bfc7de3613ae38b728c9f3d0e38ca99b39fb09f625bd27c4e542e" Mar 13 20:28:27 crc kubenswrapper[4790]: E0313 20:28:27.892941 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:28:28 crc kubenswrapper[4790]: I0313 20:28:28.605633 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:28 crc kubenswrapper[4790]: I0313 20:28:28.893593 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 20:28:29 crc kubenswrapper[4790]: I0313 20:28:29.606979 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:29 crc kubenswrapper[4790]: E0313 20:28:29.716680 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 20:28:30 crc kubenswrapper[4790]: I0313 20:28:30.606296 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:31 crc kubenswrapper[4790]: I0313 20:28:31.297320 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:31 crc kubenswrapper[4790]: I0313 20:28:31.299251 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:31 crc kubenswrapper[4790]: I0313 20:28:31.299296 4790 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:31 crc kubenswrapper[4790]: I0313 20:28:31.299308 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:31 crc kubenswrapper[4790]: I0313 20:28:31.299334 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:28:31 crc kubenswrapper[4790]: E0313 20:28:31.304992 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 20:28:31 crc kubenswrapper[4790]: E0313 20:28:31.305041 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 20:28:31 crc kubenswrapper[4790]: I0313 20:28:31.606531 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:32 crc kubenswrapper[4790]: I0313 20:28:32.605368 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:32 crc kubenswrapper[4790]: I0313 20:28:32.801737 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:28:32 crc kubenswrapper[4790]: I0313 20:28:32.801943 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:32 crc kubenswrapper[4790]: I0313 
20:28:32.803115 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:32 crc kubenswrapper[4790]: I0313 20:28:32.803157 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:32 crc kubenswrapper[4790]: I0313 20:28:32.803169 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:32 crc kubenswrapper[4790]: I0313 20:28:32.803655 4790 scope.go:117] "RemoveContainer" containerID="1d18cf69538bfc7de3613ae38b728c9f3d0e38ca99b39fb09f625bd27c4e542e" Mar 13 20:28:32 crc kubenswrapper[4790]: E0313 20:28:32.803817 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:28:33 crc kubenswrapper[4790]: I0313 20:28:33.368763 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:28:33 crc kubenswrapper[4790]: I0313 20:28:33.368974 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:33 crc kubenswrapper[4790]: I0313 20:28:33.370440 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:33 crc kubenswrapper[4790]: I0313 20:28:33.370484 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:33 crc kubenswrapper[4790]: I0313 20:28:33.370495 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 13 20:28:33 crc kubenswrapper[4790]: I0313 20:28:33.374221 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:28:33 crc kubenswrapper[4790]: I0313 20:28:33.607617 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:33 crc kubenswrapper[4790]: I0313 20:28:33.911087 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:33 crc kubenswrapper[4790]: I0313 20:28:33.912415 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:33 crc kubenswrapper[4790]: I0313 20:28:33.912450 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:33 crc kubenswrapper[4790]: I0313 20:28:33.912463 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:34 crc kubenswrapper[4790]: I0313 20:28:34.604955 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:35 crc kubenswrapper[4790]: W0313 20:28:35.012428 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 13 20:28:35 crc kubenswrapper[4790]: E0313 20:28:35.012515 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to 
watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 13 20:28:35 crc kubenswrapper[4790]: W0313 20:28:35.022250 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 13 20:28:35 crc kubenswrapper[4790]: E0313 20:28:35.022324 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 13 20:28:35 crc kubenswrapper[4790]: I0313 20:28:35.606464 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:36 crc kubenswrapper[4790]: I0313 20:28:36.605478 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:37 crc kubenswrapper[4790]: I0313 20:28:37.605527 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:37 crc kubenswrapper[4790]: I0313 20:28:37.778719 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:28:37 crc 
kubenswrapper[4790]: I0313 20:28:37.779011 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:37 crc kubenswrapper[4790]: I0313 20:28:37.780611 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:37 crc kubenswrapper[4790]: I0313 20:28:37.780661 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:37 crc kubenswrapper[4790]: I0313 20:28:37.780675 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:37 crc kubenswrapper[4790]: I0313 20:28:37.781304 4790 scope.go:117] "RemoveContainer" containerID="1d18cf69538bfc7de3613ae38b728c9f3d0e38ca99b39fb09f625bd27c4e542e" Mar 13 20:28:37 crc kubenswrapper[4790]: E0313 20:28:37.781531 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:28:37 crc kubenswrapper[4790]: W0313 20:28:37.985606 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 13 20:28:37 crc kubenswrapper[4790]: E0313 20:28:37.985729 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 13 20:28:38 crc kubenswrapper[4790]: I0313 20:28:38.305903 
4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:38 crc kubenswrapper[4790]: I0313 20:28:38.307495 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:38 crc kubenswrapper[4790]: I0313 20:28:38.307549 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:38 crc kubenswrapper[4790]: I0313 20:28:38.307568 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:38 crc kubenswrapper[4790]: I0313 20:28:38.307602 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:28:38 crc kubenswrapper[4790]: E0313 20:28:38.310091 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 20:28:38 crc kubenswrapper[4790]: E0313 20:28:38.310414 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 20:28:38 crc kubenswrapper[4790]: I0313 20:28:38.607017 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:39 crc kubenswrapper[4790]: I0313 20:28:39.605677 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:39 crc 
kubenswrapper[4790]: E0313 20:28:39.716892 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 20:28:40 crc kubenswrapper[4790]: I0313 20:28:40.605414 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:41 crc kubenswrapper[4790]: W0313 20:28:41.188333 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:41 crc kubenswrapper[4790]: E0313 20:28:41.188407 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 13 20:28:41 crc kubenswrapper[4790]: I0313 20:28:41.605539 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:42 crc kubenswrapper[4790]: I0313 20:28:42.608019 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:43 crc kubenswrapper[4790]: I0313 20:28:43.608650 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:43 crc kubenswrapper[4790]: I0313 20:28:43.669825 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 20:28:43 crc kubenswrapper[4790]: I0313 20:28:43.670784 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:43 crc kubenswrapper[4790]: I0313 20:28:43.672588 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:43 crc kubenswrapper[4790]: I0313 20:28:43.672642 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:43 crc kubenswrapper[4790]: I0313 20:28:43.672661 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:44 crc kubenswrapper[4790]: I0313 20:28:44.606250 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:45 crc kubenswrapper[4790]: I0313 20:28:45.310390 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:45 crc kubenswrapper[4790]: I0313 20:28:45.311747 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:45 crc kubenswrapper[4790]: I0313 20:28:45.311812 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:45 crc kubenswrapper[4790]: I0313 20:28:45.311824 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:45 crc 
kubenswrapper[4790]: I0313 20:28:45.312140 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:28:45 crc kubenswrapper[4790]: E0313 20:28:45.315413 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 20:28:45 crc kubenswrapper[4790]: E0313 20:28:45.315465 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 20:28:45 crc kubenswrapper[4790]: I0313 20:28:45.606536 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:46 crc kubenswrapper[4790]: I0313 20:28:46.605256 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:47 crc kubenswrapper[4790]: I0313 20:28:47.605147 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:48 crc kubenswrapper[4790]: I0313 20:28:48.606821 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:49 crc kubenswrapper[4790]: I0313 20:28:49.604864 4790 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:49 crc kubenswrapper[4790]: E0313 20:28:49.717691 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 20:28:50 crc kubenswrapper[4790]: I0313 20:28:50.608551 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.604831 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.658916 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.660112 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.660147 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.660156 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.660685 4790 scope.go:117] "RemoveContainer" containerID="1d18cf69538bfc7de3613ae38b728c9f3d0e38ca99b39fb09f625bd27c4e542e" Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.978428 4790 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.980836 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab"} Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.981078 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.982529 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.982582 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.982608 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.315591 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.316957 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.316997 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.317009 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.317030 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:28:52 crc kubenswrapper[4790]: 
E0313 20:28:52.320657 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 20:28:52 crc kubenswrapper[4790]: E0313 20:28:52.321091 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.607948 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.801046 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.984976 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.985699 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.987505 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab" exitCode=255 Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.987650 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab"} Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.987969 4790 scope.go:117] "RemoveContainer" containerID="1d18cf69538bfc7de3613ae38b728c9f3d0e38ca99b39fb09f625bd27c4e542e" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.987752 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.989094 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.989360 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.990357 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.991419 4790 scope.go:117] "RemoveContainer" containerID="39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab" Mar 13 20:28:52 crc kubenswrapper[4790]: E0313 20:28:52.991694 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:28:53 crc kubenswrapper[4790]: I0313 20:28:53.605561 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:53 crc kubenswrapper[4790]: I0313 
20:28:53.991248 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 20:28:53 crc kubenswrapper[4790]: I0313 20:28:53.993981 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:53 crc kubenswrapper[4790]: I0313 20:28:53.994830 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:53 crc kubenswrapper[4790]: I0313 20:28:53.994869 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:53 crc kubenswrapper[4790]: I0313 20:28:53.994883 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:53 crc kubenswrapper[4790]: I0313 20:28:53.995399 4790 scope.go:117] "RemoveContainer" containerID="39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab" Mar 13 20:28:53 crc kubenswrapper[4790]: E0313 20:28:53.995551 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:28:54 crc kubenswrapper[4790]: I0313 20:28:54.219397 4790 csr.go:261] certificate signing request csr-72vmj is approved, waiting to be issued Mar 13 20:28:54 crc kubenswrapper[4790]: I0313 20:28:54.231016 4790 csr.go:257] certificate signing request csr-72vmj is issued Mar 13 20:28:54 crc kubenswrapper[4790]: I0313 20:28:54.262313 4790 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 13 20:28:54 crc kubenswrapper[4790]: I0313 20:28:54.452474 
4790 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 13 20:28:55 crc kubenswrapper[4790]: I0313 20:28:55.233684 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-03 12:34:24.682533494 +0000 UTC Mar 13 20:28:55 crc kubenswrapper[4790]: I0313 20:28:55.234026 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6352h5m29.448512437s for next certificate rotation Mar 13 20:28:57 crc kubenswrapper[4790]: I0313 20:28:57.779069 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:28:57 crc kubenswrapper[4790]: I0313 20:28:57.779256 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:57 crc kubenswrapper[4790]: I0313 20:28:57.780702 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:57 crc kubenswrapper[4790]: I0313 20:28:57.780738 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:57 crc kubenswrapper[4790]: I0313 20:28:57.780754 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:57 crc kubenswrapper[4790]: I0313 20:28:57.781291 4790 scope.go:117] "RemoveContainer" containerID="39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab" Mar 13 20:28:57 crc kubenswrapper[4790]: E0313 20:28:57.781497 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.321315 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.322910 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.322959 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.322977 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.323093 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.331236 4790 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.331591 4790 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.331628 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.336353 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.336624 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.336673 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:59 
crc kubenswrapper[4790]: I0313 20:28:59.336707 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.336741 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:28:59Z","lastTransitionTime":"2026-03-13T20:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.353922 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff54
7002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cd
fc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"nam
es\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.360445 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.360480 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.360490 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.360504 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.360514 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:28:59Z","lastTransitionTime":"2026-03-13T20:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.369597 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.377130 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.377175 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.377186 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.377205 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.377215 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:28:59Z","lastTransitionTime":"2026-03-13T20:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.386146 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.392809 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.392841 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.392854 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.392873 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.392884 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:28:59Z","lastTransitionTime":"2026-03-13T20:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.401394 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.401544 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.401572 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.501755 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.601957 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.702674 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.717837 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.803409 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.904038 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:00 crc kubenswrapper[4790]: E0313 20:29:00.004130 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:00 crc kubenswrapper[4790]: E0313 20:29:00.104644 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:00 crc kubenswrapper[4790]: 
E0313 20:29:00.205502 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:00 crc kubenswrapper[4790]: E0313 20:29:00.306336 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:00 crc kubenswrapper[4790]: E0313 20:29:00.406893 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:00 crc kubenswrapper[4790]: E0313 20:29:00.507798 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:00 crc kubenswrapper[4790]: E0313 20:29:00.608856 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:00 crc kubenswrapper[4790]: E0313 20:29:00.709983 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:00 crc kubenswrapper[4790]: E0313 20:29:00.811117 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:00 crc kubenswrapper[4790]: E0313 20:29:00.912138 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:01 crc kubenswrapper[4790]: E0313 20:29:01.013199 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:01 crc kubenswrapper[4790]: E0313 20:29:01.114302 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:01 crc kubenswrapper[4790]: E0313 20:29:01.215263 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:01 crc kubenswrapper[4790]: E0313 20:29:01.316201 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 13 20:29:01 crc kubenswrapper[4790]: E0313 20:29:01.416990 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:01 crc kubenswrapper[4790]: E0313 20:29:01.517524 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:01 crc kubenswrapper[4790]: E0313 20:29:01.618466 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:01 crc kubenswrapper[4790]: E0313 20:29:01.719508 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:01 crc kubenswrapper[4790]: E0313 20:29:01.819933 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:01 crc kubenswrapper[4790]: E0313 20:29:01.920887 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:02 crc kubenswrapper[4790]: E0313 20:29:02.021277 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:02 crc kubenswrapper[4790]: E0313 20:29:02.122332 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:02 crc kubenswrapper[4790]: E0313 20:29:02.223138 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:02 crc kubenswrapper[4790]: E0313 20:29:02.323743 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:02 crc kubenswrapper[4790]: E0313 20:29:02.424419 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:02 crc kubenswrapper[4790]: I0313 20:29:02.450325 4790 reflector.go:368] Caches populated for *v1.RuntimeClass from 
k8s.io/client-go/informers/factory.go:160 Mar 13 20:29:02 crc kubenswrapper[4790]: E0313 20:29:02.524926 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:02 crc kubenswrapper[4790]: E0313 20:29:02.625788 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:02 crc kubenswrapper[4790]: E0313 20:29:02.726818 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:02 crc kubenswrapper[4790]: E0313 20:29:02.827614 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:02 crc kubenswrapper[4790]: E0313 20:29:02.928483 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:03 crc kubenswrapper[4790]: E0313 20:29:03.029731 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:03 crc kubenswrapper[4790]: E0313 20:29:03.130615 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:03 crc kubenswrapper[4790]: E0313 20:29:03.231769 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:03 crc kubenswrapper[4790]: E0313 20:29:03.331892 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:03 crc kubenswrapper[4790]: E0313 20:29:03.432807 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:03 crc kubenswrapper[4790]: E0313 20:29:03.534016 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:03 crc kubenswrapper[4790]: E0313 20:29:03.635057 4790 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:03 crc kubenswrapper[4790]: E0313 20:29:03.736691 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:03 crc kubenswrapper[4790]: E0313 20:29:03.837719 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:03 crc kubenswrapper[4790]: E0313 20:29:03.938427 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:04 crc kubenswrapper[4790]: E0313 20:29:04.039073 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:04 crc kubenswrapper[4790]: E0313 20:29:04.139328 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:04 crc kubenswrapper[4790]: E0313 20:29:04.239841 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:04 crc kubenswrapper[4790]: E0313 20:29:04.341095 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:04 crc kubenswrapper[4790]: E0313 20:29:04.442053 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:04 crc kubenswrapper[4790]: E0313 20:29:04.542738 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:04 crc kubenswrapper[4790]: E0313 20:29:04.643459 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:04 crc kubenswrapper[4790]: E0313 20:29:04.744043 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:04 crc 
kubenswrapper[4790]: E0313 20:29:04.845269 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:04 crc kubenswrapper[4790]: E0313 20:29:04.946407 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:05 crc kubenswrapper[4790]: E0313 20:29:05.046757 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:05 crc kubenswrapper[4790]: E0313 20:29:05.147550 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:05 crc kubenswrapper[4790]: E0313 20:29:05.248028 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:05 crc kubenswrapper[4790]: E0313 20:29:05.349275 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:05 crc kubenswrapper[4790]: E0313 20:29:05.450149 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:05 crc kubenswrapper[4790]: E0313 20:29:05.551289 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:05 crc kubenswrapper[4790]: E0313 20:29:05.652191 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:05 crc kubenswrapper[4790]: E0313 20:29:05.752505 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:05 crc kubenswrapper[4790]: E0313 20:29:05.853429 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:05 crc kubenswrapper[4790]: E0313 20:29:05.953655 4790 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 13 20:29:06 crc kubenswrapper[4790]: E0313 20:29:06.054196 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:06 crc kubenswrapper[4790]: E0313 20:29:06.155031 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:06 crc kubenswrapper[4790]: E0313 20:29:06.255429 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:06 crc kubenswrapper[4790]: E0313 20:29:06.355754 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:06 crc kubenswrapper[4790]: E0313 20:29:06.456346 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:06 crc kubenswrapper[4790]: E0313 20:29:06.556974 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:06 crc kubenswrapper[4790]: E0313 20:29:06.657648 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:06 crc kubenswrapper[4790]: E0313 20:29:06.758468 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:06 crc kubenswrapper[4790]: E0313 20:29:06.858833 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:06 crc kubenswrapper[4790]: E0313 20:29:06.959135 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:07 crc kubenswrapper[4790]: E0313 20:29:07.060133 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:07 crc kubenswrapper[4790]: E0313 20:29:07.160762 4790 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:07 crc kubenswrapper[4790]: E0313 20:29:07.260980 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:07 crc kubenswrapper[4790]: E0313 20:29:07.361452 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:07 crc kubenswrapper[4790]: E0313 20:29:07.462047 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:07 crc kubenswrapper[4790]: E0313 20:29:07.562729 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:07 crc kubenswrapper[4790]: E0313 20:29:07.663761 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:07 crc kubenswrapper[4790]: E0313 20:29:07.764113 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:07 crc kubenswrapper[4790]: E0313 20:29:07.864914 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:07 crc kubenswrapper[4790]: E0313 20:29:07.966104 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:08 crc kubenswrapper[4790]: E0313 20:29:08.067037 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:08 crc kubenswrapper[4790]: E0313 20:29:08.167796 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:08 crc kubenswrapper[4790]: E0313 20:29:08.268236 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:08 crc kubenswrapper[4790]: E0313 
20:29:08.368552 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:08 crc kubenswrapper[4790]: E0313 20:29:08.469799 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:08 crc kubenswrapper[4790]: E0313 20:29:08.570896 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:08 crc kubenswrapper[4790]: E0313 20:29:08.672018 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:08 crc kubenswrapper[4790]: E0313 20:29:08.773097 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:08 crc kubenswrapper[4790]: E0313 20:29:08.874427 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:08 crc kubenswrapper[4790]: E0313 20:29:08.975749 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.076198 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.177230 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.278617 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.379719 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.480630 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 
20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.582265 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.683367 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.718505 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.764612 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.769616 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.769651 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.769663 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.769679 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.769693 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:09Z","lastTransitionTime":"2026-03-13T20:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.779117 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.783820 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.783851 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.783863 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.783877 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.783888 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:09Z","lastTransitionTime":"2026-03-13T20:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.794014 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.797810 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.797859 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.797871 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.797891 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.797906 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:09Z","lastTransitionTime":"2026-03-13T20:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.809871 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.813958 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.814005 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.814018 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.814039 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.814052 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:09Z","lastTransitionTime":"2026-03-13T20:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.825820 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.825974 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.826015 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.926144 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:10 crc kubenswrapper[4790]: E0313 20:29:10.026782 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:10 crc kubenswrapper[4790]: E0313 20:29:10.127532 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:10 crc kubenswrapper[4790]: E0313 20:29:10.227728 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:10 crc kubenswrapper[4790]: E0313 20:29:10.328029 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:10 crc kubenswrapper[4790]: E0313 20:29:10.429298 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:10 crc kubenswrapper[4790]: E0313 20:29:10.530234 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:10 crc kubenswrapper[4790]: E0313 20:29:10.631262 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:10 crc kubenswrapper[4790]: E0313 20:29:10.731480 4790 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:10 crc kubenswrapper[4790]: E0313 20:29:10.831987 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:10 crc kubenswrapper[4790]: E0313 20:29:10.932930 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:11 crc kubenswrapper[4790]: E0313 20:29:11.033315 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:11 crc kubenswrapper[4790]: E0313 20:29:11.134427 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:11 crc kubenswrapper[4790]: E0313 20:29:11.235554 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:11 crc kubenswrapper[4790]: E0313 20:29:11.335963 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:11 crc kubenswrapper[4790]: E0313 20:29:11.436999 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:11 crc kubenswrapper[4790]: E0313 20:29:11.537301 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:11 crc kubenswrapper[4790]: E0313 20:29:11.638528 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:11 crc kubenswrapper[4790]: E0313 20:29:11.739445 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:11 crc kubenswrapper[4790]: E0313 20:29:11.840546 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:11 crc 
kubenswrapper[4790]: E0313 20:29:11.941504 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:12 crc kubenswrapper[4790]: E0313 20:29:12.042726 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:12 crc kubenswrapper[4790]: E0313 20:29:12.143071 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:12 crc kubenswrapper[4790]: E0313 20:29:12.243526 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:12 crc kubenswrapper[4790]: E0313 20:29:12.344014 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:12 crc kubenswrapper[4790]: E0313 20:29:12.445154 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:12 crc kubenswrapper[4790]: E0313 20:29:12.546287 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:12 crc kubenswrapper[4790]: E0313 20:29:12.647044 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:12 crc kubenswrapper[4790]: I0313 20:29:12.659760 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:29:12 crc kubenswrapper[4790]: I0313 20:29:12.660990 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:12 crc kubenswrapper[4790]: I0313 20:29:12.661026 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:12 crc kubenswrapper[4790]: I0313 20:29:12.661036 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 13 20:29:12 crc kubenswrapper[4790]: I0313 20:29:12.661571 4790 scope.go:117] "RemoveContainer" containerID="39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab" Mar 13 20:29:12 crc kubenswrapper[4790]: E0313 20:29:12.661736 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:29:12 crc kubenswrapper[4790]: E0313 20:29:12.747984 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:12 crc kubenswrapper[4790]: E0313 20:29:12.848702 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:12 crc kubenswrapper[4790]: E0313 20:29:12.949818 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.050872 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.151435 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.252011 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.353045 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.359955 4790 reflector.go:368] Caches populated for *v1.Service from 
k8s.io/client-go/informers/factory.go:160 Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.361762 4790 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.455860 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.455939 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.455964 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.456020 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.456043 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:13Z","lastTransitionTime":"2026-03-13T20:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.558791 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.558859 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.558877 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.558900 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.558919 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:13Z","lastTransitionTime":"2026-03-13T20:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.626695 4790 apiserver.go:52] "Watching apiserver" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.640169 4790 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.641594 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-operator/iptables-alerter-4ln5h","openshift-image-registry/node-ca-9tpww","openshift-machine-config-operator/machine-config-daemon-drtsx","openshift-multus/multus-additional-cni-plugins-wq8kp","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75","openshift-multus/network-metrics-daemon-mnf26","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-x4d2p","openshift-network-node-identity/network-node-identity-vrzqb","openshift-multus/multus-x2tjg","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-ovn-kubernetes/ovnkube-node-gz4fj"] Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.642263 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.642269 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.642728 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.643178 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.643252 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.643308 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.643782 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.643794 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.644235 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.644251 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.643937 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-x4d2p" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.646101 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.646263 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.646303 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.646336 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9tpww" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.646400 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.646435 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.646464 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.646758 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.646770 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.647454 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.647465 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.647484 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.647859 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.648035 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.648141 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.648215 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.648307 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.650951 4790 pod_workers.go:1301] "Error syncing pod, skipping" 
err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.653895 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.654180 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.654437 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.654515 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.654719 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.655431 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.655875 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.656860 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.656937 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.657008 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.656882 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.657203 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.657435 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.658553 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.658598 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.658886 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.659020 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.660997 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.661023 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.661032 4790 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.661047 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.661057 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:13Z","lastTransitionTime":"2026-03-13T20:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.661639 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.661795 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.661843 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.661857 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.661504 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.662020 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.662083 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.662131 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.662175 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.680973 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.697110 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.701751 4790 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.712407 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.726842 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.747988 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.757716 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.763172 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.763232 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.763243 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.763257 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.763303 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:13Z","lastTransitionTime":"2026-03-13T20:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.765013 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.772826 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with 
unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.782223 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784643 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784685 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784709 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784746 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784766 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784782 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784801 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 20:29:13 
crc kubenswrapper[4790]: I0313 20:29:13.784824 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784848 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784869 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784889 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784908 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784926 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784944 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784961 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784981 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785002 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785024 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 20:29:13 
crc kubenswrapper[4790]: I0313 20:29:13.785041 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785058 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785075 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785093 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785110 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785129 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785151 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785137 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785173 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785193 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785213 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") 
" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785230 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785247 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785265 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785286 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785333 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785351 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785396 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785416 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785434 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785537 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785558 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 
20:29:13.785576 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785594 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785613 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785634 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785653 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785674 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785695 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785717 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785743 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785761 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785778 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 
20:29:13.785797 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785814 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785832 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785849 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785868 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785891 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785910 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785930 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785950 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785971 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785989 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 20:29:13 crc 
kubenswrapper[4790]: I0313 20:29:13.786011 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786029 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786050 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786069 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786089 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786115 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" 
(UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786132 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786152 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786174 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786193 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786210 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 20:29:13 crc 
kubenswrapper[4790]: I0313 20:29:13.786227 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786247 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786266 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786283 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786301 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786320 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786340 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786366 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786405 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786423 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786442 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc 
kubenswrapper[4790]: I0313 20:29:13.786460 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786478 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786495 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786516 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786534 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786555 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786577 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786595 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786615 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786633 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786651 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786670 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786693 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786711 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786732 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786754 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786776 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786794 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786812 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786830 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786847 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786864 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 
20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786881 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786900 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786921 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786940 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786961 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786981 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787002 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787024 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787041 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787061 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787081 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: 
\"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787099 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787119 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787135 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787155 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787176 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787194 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787212 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787228 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787245 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787265 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787287 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 20:29:13 
crc kubenswrapper[4790]: I0313 20:29:13.787312 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787331 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787351 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787372 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787413 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787434 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787456 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787478 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787501 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787520 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787547 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 
crc kubenswrapper[4790]: I0313 20:29:13.787570 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787591 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787611 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787629 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787647 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787669 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod 
\"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787688 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787707 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787725 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787743 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787763 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787785 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787809 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787834 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787852 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787870 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787890 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 20:29:13 
crc kubenswrapper[4790]: I0313 20:29:13.787907 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787927 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787949 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787968 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787991 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788011 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788029 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788047 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788067 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788088 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785160 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788114 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785598 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788134 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785950 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788154 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786015 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786307 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786344 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786619 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786956 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787289 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787623 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787690 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787964 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788605 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788640 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788668 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.788765 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:29:14.288746633 +0000 UTC m=+85.309862644 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.789019 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.789370 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.789440 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.789499 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.789691 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.789746 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.790040 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.790125 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.790282 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.790632 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.790706 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.791540 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.791680 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.791992 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792162 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792524 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788173 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792707 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792734 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792760 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792790 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792813 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792835 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792855 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792875 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792895 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792914 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792937 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792959 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792978 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792998 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793016 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793017 4790 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793037 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793049 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793061 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793107 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793125 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793162 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793290 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793348 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793357 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793395 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793431 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793456 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793484 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793508 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793531 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793813 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmkvj\" (UniqueName: \"kubernetes.io/projected/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-kube-api-access-pmkvj\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793855 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-kubelet\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793880 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-node-log\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793906 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793957 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794107 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/58464a30-7f56-4e13-894e-e53498a85637-mcd-auth-proxy-config\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794137 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-cni-binary-copy\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794158 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-socket-dir-parent\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794188 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-ovn\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794211 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794235 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/58464a30-7f56-4e13-894e-e53498a85637-rootfs\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794258 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-hostroot\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794242 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794285 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-daemon-config\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794314 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-etc-openvswitch\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794338 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-ovn-kubernetes\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794361 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g48h\" (UniqueName: \"kubernetes.io/projected/c54336a0-5a12-4bf9-9807-337dd352fdb6-kube-api-access-7g48h\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794405 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-system-cni-dir\") pod \"multus-x2tjg\" 
(UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794426 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-run-k8s-cni-cncf-io\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794448 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-script-lib\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794471 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-cnibin\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794488 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794499 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h24bv\" (UniqueName: \"kubernetes.io/projected/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-kube-api-access-h24bv\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794532 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794559 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-var-lib-cni-bin\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794585 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-systemd-units\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794618 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: 
\"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794642 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794668 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/05405fad-1758-412e-b3ab-9714a604b207-host\") pod \"node-ca-9tpww\" (UID: \"05405fad-1758-412e-b3ab-9714a604b207\") " pod="openshift-image-registry/node-ca-9tpww" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794631 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794691 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-cnibin\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794715 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794723 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-run-multus-certs\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794774 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794779 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794817 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794859 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794886 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-os-release\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794908 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-cni-dir\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794926 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-conf-dir\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794952 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f8e0711-7595-4580-b702-558512c33395-env-overrides\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794978 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-netns\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794997 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58464a30-7f56-4e13-894e-e53498a85637-proxy-tls\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795016 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795036 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-system-cni-dir\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795057 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-openvswitch\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795158 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7thmq\" (UniqueName: \"kubernetes.io/projected/05405fad-1758-412e-b3ab-9714a604b207-kube-api-access-7thmq\") pod \"node-ca-9tpww\" (UID: \"05405fad-1758-412e-b3ab-9714a604b207\") " pod="openshift-image-registry/node-ca-9tpww" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795146 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795184 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795226 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795292 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h76wc\" (UniqueName: \"kubernetes.io/projected/96d699b6-dfba-4b76-b3e8-0480527aa386-kube-api-access-h76wc\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795321 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2vjb\" (UniqueName: \"kubernetes.io/projected/58464a30-7f56-4e13-894e-e53498a85637-kube-api-access-h2vjb\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795345 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-var-lib-kubelet\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795401 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795421 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6x7z\" (UniqueName: \"kubernetes.io/projected/58c65c62-097b-4179-9ada-1627afa9fef2-kube-api-access-w6x7z\") pod \"node-resolver-x4d2p\" (UID: \"58c65c62-097b-4179-9ada-1627afa9fef2\") " pod="openshift-dns/node-resolver-x4d2p" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795491 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-bin\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795522 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-netd\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795544 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795573 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-etc-kubernetes\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795607 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795630 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/96d699b6-dfba-4b76-b3e8-0480527aa386-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795656 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/58c65c62-097b-4179-9ada-1627afa9fef2-hosts-file\") pod \"node-resolver-x4d2p\" (UID: \"58c65c62-097b-4179-9ada-1627afa9fef2\") " pod="openshift-dns/node-resolver-x4d2p" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795682 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f8e0711-7595-4580-b702-558512c33395-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795701 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f8e0711-7595-4580-b702-558512c33395-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795727 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-slash\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795744 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovn-node-metrics-cert\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795760 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-var-lib-cni-multus\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " 
pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795778 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-run-netns\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795798 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-systemd\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795814 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-config\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795838 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795861 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795880 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-env-overrides\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795905 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/05405fad-1758-412e-b3ab-9714a604b207-serviceca\") pod \"node-ca-9tpww\" (UID: \"05405fad-1758-412e-b3ab-9714a604b207\") " pod="openshift-image-registry/node-ca-9tpww" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795941 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/96d699b6-dfba-4b76-b3e8-0480527aa386-cni-binary-copy\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795958 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-log-socket\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795982 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796013 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq7qg\" (UniqueName: \"kubernetes.io/projected/2f8e0711-7595-4580-b702-558512c33395-kube-api-access-fq7qg\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796037 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-var-lib-openvswitch\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796058 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-os-release\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796104 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796154 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796627 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796654 4790 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796681 4790 reconciler_common.go:293] "Volume 
detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796710 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796735 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796761 4790 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796785 4790 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796808 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796830 4790 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796853 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796875 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796898 4790 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796925 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796946 4790 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796969 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795342 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795424 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796169 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796232 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796469 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796545 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796597 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796871 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.797130 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.797141 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.797154 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.797324 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.797649 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.797726 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). 
InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.797917 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.798069 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.798124 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.798308 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.798609 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.798731 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.798787 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.798984 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.799139 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.799157 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.799285 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.799326 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.799343 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.799729 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.799799 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.799830 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.800072 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.800122 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.800150 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.800170 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.800187 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.800251 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.800449 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.799324 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.799361 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.811819 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.811846 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.811874 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.811898 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.811920 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.811941 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.811962 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.811983 4790 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812004 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812025 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812048 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812069 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" 
DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812092 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812114 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812134 4790 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812155 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812176 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812198 4790 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812218 4790 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812239 4790 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812259 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812280 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812302 4790 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812323 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812344 4790 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812365 4790 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.822719 4790 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.822760 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.822778 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.822853 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:14.322830968 +0000 UTC m=+85.343946859 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.823925 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.823992 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.824023 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.824194 4790 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.824341 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:14.324098884 +0000 UTC m=+85.345214805 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.824463 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.824671 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.825092 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.825575 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.825773 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.825935 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.825919 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.826180 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.826229 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.826476 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.826777 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.826838 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.826921 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.826981 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.827055 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.827120 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.827123 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.827134 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.827174 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.827553 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.827760 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.827851 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.827885 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.827888 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.827908 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.828044 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.828066 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.828165 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:14.328139149 +0000 UTC m=+85.349255100 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.828257 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:14.328212541 +0000 UTC m=+85.349328542 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.828281 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.828316 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.828618 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.828968 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.829330 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.829391 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.830046 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.830121 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.830076 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.830345 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.830551 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.830735 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.831072 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.831263 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.831158 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.831444 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.831751 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.831839 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.831950 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.832019 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.832233 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.832607 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.832602 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.832933 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.833188 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.833591 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.833765 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.834536 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.837198 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.837541 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.838061 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.839844 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.840998 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.842300 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.842731 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.843047 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.843295 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.843530 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.844414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.844531 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.844778 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.847433 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.849303 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.854856 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.855731 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.856070 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.856245 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.856183 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.856484 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.856730 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.857048 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.857110 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.857233 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.857266 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.857335 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.857423 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.858435 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.858519 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.858671 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.858983 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.858913 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.859942 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.860095 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.860237 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.860329 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.860827 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.861128 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.861175 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.861458 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.861497 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.861782 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.862227 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.862480 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.862506 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.862615 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.862616 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.862667 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.862699 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.864848 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.864969 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.865049 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.865450 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.865684 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.866052 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.866284 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.866971 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.867137 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.867187 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.867305 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.867350 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.867986 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.868013 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.868120 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.868356 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.868366 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.868393 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.868175 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: 
"49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.868404 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:13Z","lastTransitionTime":"2026-03-13T20:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.868601 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.868684 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.868895 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.868979 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.876818 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.881124 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.887480 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.890317 4790 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.894590 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913440 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-etc-openvswitch\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913498 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-ovn-kubernetes\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913525 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g48h\" (UniqueName: \"kubernetes.io/projected/c54336a0-5a12-4bf9-9807-337dd352fdb6-kube-api-access-7g48h\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:13 crc 
kubenswrapper[4790]: I0313 20:29:13.913547 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-system-cni-dir\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913560 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-ovn-kubernetes\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913572 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-run-k8s-cni-cncf-io\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913630 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-run-k8s-cni-cncf-io\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913643 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-script-lib\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913672 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-cnibin\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913699 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h24bv\" (UniqueName: \"kubernetes.io/projected/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-kube-api-access-h24bv\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913724 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-systemd-units\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913543 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-etc-openvswitch\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913794 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/05405fad-1758-412e-b3ab-9714a604b207-host\") pod \"node-ca-9tpww\" (UID: \"05405fad-1758-412e-b3ab-9714a604b207\") " pod="openshift-image-registry/node-ca-9tpww" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913870 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-systemd-units\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913723 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-system-cni-dir\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.914074 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/05405fad-1758-412e-b3ab-9714a604b207-host\") pod \"node-ca-9tpww\" (UID: \"05405fad-1758-412e-b3ab-9714a604b207\") " pod="openshift-image-registry/node-ca-9tpww" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.914126 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-cnibin\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.914177 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-cnibin\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913824 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-cnibin\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " 
pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.914661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-var-lib-cni-bin\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.914716 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.914778 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-os-release\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.914816 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-cni-dir\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.914846 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-run-multus-certs\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " 
pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.914885 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f8e0711-7595-4580-b702-558512c33395-env-overrides\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.914967 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-netns\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915012 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-os-release\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915073 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58464a30-7f56-4e13-894e-e53498a85637-proxy-tls\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915107 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-script-lib\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915110 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915157 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-cni-dir\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915169 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-system-cni-dir\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915200 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-system-cni-dir\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915211 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-conf-dir\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc 
kubenswrapper[4790]: E0313 20:29:13.915226 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915263 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-openvswitch\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915237 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-openvswitch\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915287 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-netns\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915264 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-conf-dir\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.915306 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs podName:c54336a0-5a12-4bf9-9807-337dd352fdb6 nodeName:}" failed. 
No retries permitted until 2026-03-13 20:29:14.415283652 +0000 UTC m=+85.436399573 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs") pod "network-metrics-daemon-mnf26" (UID: "c54336a0-5a12-4bf9-9807-337dd352fdb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915234 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-run-multus-certs\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915345 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7thmq\" (UniqueName: \"kubernetes.io/projected/05405fad-1758-412e-b3ab-9714a604b207-kube-api-access-7thmq\") pod \"node-ca-9tpww\" (UID: \"05405fad-1758-412e-b3ab-9714a604b207\") " pod="openshift-image-registry/node-ca-9tpww" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915409 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915487 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h76wc\" (UniqueName: \"kubernetes.io/projected/96d699b6-dfba-4b76-b3e8-0480527aa386-kube-api-access-h76wc\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " 
pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915561 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-var-lib-kubelet\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915660 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6x7z\" (UniqueName: \"kubernetes.io/projected/58c65c62-097b-4179-9ada-1627afa9fef2-kube-api-access-w6x7z\") pod \"node-resolver-x4d2p\" (UID: \"58c65c62-097b-4179-9ada-1627afa9fef2\") " pod="openshift-dns/node-resolver-x4d2p" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915784 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f8e0711-7595-4580-b702-558512c33395-env-overrides\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915803 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-bin\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915867 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-var-lib-kubelet\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 
20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915936 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-bin\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916051 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-netd\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916069 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-netd\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.914901 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916155 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2vjb\" (UniqueName: \"kubernetes.io/projected/58464a30-7f56-4e13-894e-e53498a85637-kube-api-access-h2vjb\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc 
kubenswrapper[4790]: I0313 20:29:13.916413 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-etc-kubernetes\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916431 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-etc-kubernetes\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916433 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916480 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/96d699b6-dfba-4b76-b3e8-0480527aa386-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916542 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/58c65c62-097b-4179-9ada-1627afa9fef2-hosts-file\") pod \"node-resolver-x4d2p\" (UID: \"58c65c62-097b-4179-9ada-1627afa9fef2\") " pod="openshift-dns/node-resolver-x4d2p" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916578 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f8e0711-7595-4580-b702-558512c33395-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916631 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f8e0711-7595-4580-b702-558512c33395-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-slash\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916688 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovn-node-metrics-cert\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916715 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-var-lib-cni-multus\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916748 
4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-systemd\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916745 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/58c65c62-097b-4179-9ada-1627afa9fef2-hosts-file\") pod \"node-resolver-x4d2p\" (UID: \"58c65c62-097b-4179-9ada-1627afa9fef2\") " pod="openshift-dns/node-resolver-x4d2p" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916824 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-config\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917226 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f8e0711-7595-4580-b702-558512c33395-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917275 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-var-lib-cni-multus\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917308 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" 
(UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-systemd\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917338 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-slash\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917419 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-run-netns\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917477 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-env-overrides\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917516 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/05405fad-1758-412e-b3ab-9714a604b207-serviceca\") pod \"node-ca-9tpww\" (UID: \"05405fad-1758-412e-b3ab-9714a604b207\") " pod="openshift-image-registry/node-ca-9tpww" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917544 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/96d699b6-dfba-4b76-b3e8-0480527aa386-cni-binary-copy\") pod \"multus-additional-cni-plugins-wq8kp\" 
(UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917572 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-log-socket\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917607 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq7qg\" (UniqueName: \"kubernetes.io/projected/2f8e0711-7595-4580-b702-558512c33395-kube-api-access-fq7qg\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917635 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-var-lib-openvswitch\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-os-release\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917690 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmkvj\" (UniqueName: \"kubernetes.io/projected/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-kube-api-access-pmkvj\") pod \"multus-x2tjg\" (UID: 
\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917732 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-kubelet\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917796 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-node-log\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917828 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917862 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/58464a30-7f56-4e13-894e-e53498a85637-mcd-auth-proxy-config\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917903 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-socket-dir-parent\") pod \"multus-x2tjg\" (UID: 
\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917936 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-ovn\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917963 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917991 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/58464a30-7f56-4e13-894e-e53498a85637-rootfs\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.918022 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-cni-binary-copy\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.918045 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-hostroot\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 
20:29:13.918079 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-daemon-config\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.918116 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-run-netns\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.918273 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.918277 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/58464a30-7f56-4e13-894e-e53498a85637-rootfs\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.918325 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-socket-dir-parent\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917904 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-config\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.918412 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-ovn\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.919025 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/58464a30-7f56-4e13-894e-e53498a85637-mcd-auth-proxy-config\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.920285 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-hostroot\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.920426 4790 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.920466 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-var-lib-openvswitch\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 
crc kubenswrapper[4790]: I0313 20:29:13.920521 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-os-release\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.920544 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-log-socket\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.921712 4790 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.921844 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.921924 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.921998 4790 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.922073 4790 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.922147 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.922237 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.922315 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.922458 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.922566 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.922723 4790 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.922811 4790 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.922891 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.922971 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.923050 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.923137 4790 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.923211 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.923417 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.923515 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.923590 4790 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.923660 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.923730 4790 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.923832 4790 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.923913 4790 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.923986 4790 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.924064 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" 
DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.924133 4790 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.924968 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925078 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925152 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925228 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925299 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925369 4790 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925483 4790 reconciler_common.go:293] "Volume detached for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925561 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925635 4790 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925706 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925806 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925881 4790 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925956 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926027 4790 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" 
Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926105 4790 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926180 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926251 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926321 4790 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926418 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926508 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926582 4790 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926660 4790 reconciler_common.go:293] "Volume detached 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926731 4790 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926801 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926881 4790 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926956 4790 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.927041 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.927122 4790 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.927202 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.927282 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.927588 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.927680 4790 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.927752 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.927836 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.927915 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928000 4790 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 
20:29:13.928082 4790 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928156 4790 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928227 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928304 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928395 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928498 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928574 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928657 4790 reconciler_common.go:293] "Volume 
detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928744 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928827 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928907 4790 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928980 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929049 4790 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929123 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929196 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929268 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929352 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929447 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929519 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929598 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929677 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929747 4790 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 13 
20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929842 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929916 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929985 4790 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930060 4790 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930135 4790 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930222 4790 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930295 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 
20:29:13.930366 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930542 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930506 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-kubelet\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930518 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-node-log\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930543 4790 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930602 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930617 4790 reconciler_common.go:293] "Volume detached for 
volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930629 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930650 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930664 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930677 4790 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930690 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930702 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930720 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930731 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930744 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930757 4790 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930769 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930781 4790 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930792 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930804 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930820 4790 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930842 4790 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930854 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930864 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930873 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930883 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930894 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: 
\"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930903 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930911 4790 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930923 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930932 4790 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930940 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930952 4790 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930961 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 13 
20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930970 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930978 4790 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930986 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930994 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931005 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931013 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931022 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931031 4790 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931039 4790 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931053 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931062 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931070 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931099 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931110 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931119 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931131 4790 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931142 4790 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931156 4790 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931167 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931180 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931190 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931199 4790 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931210 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931219 4790 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931227 4790 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930488 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-var-lib-cni-bin\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.932080 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-env-overrides\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.932293 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/05405fad-1758-412e-b3ab-9714a604b207-serviceca\") pod \"node-ca-9tpww\" (UID: \"05405fad-1758-412e-b3ab-9714a604b207\") " 
pod="openshift-image-registry/node-ca-9tpww" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.932872 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-cni-binary-copy\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.936594 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g48h\" (UniqueName: \"kubernetes.io/projected/c54336a0-5a12-4bf9-9807-337dd352fdb6-kube-api-access-7g48h\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.937600 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/96d699b6-dfba-4b76-b3e8-0480527aa386-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.937705 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h76wc\" (UniqueName: \"kubernetes.io/projected/96d699b6-dfba-4b76-b3e8-0480527aa386-kube-api-access-h76wc\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.938269 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58464a30-7f56-4e13-894e-e53498a85637-proxy-tls\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " 
pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.938721 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-daemon-config\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.939181 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmkvj\" (UniqueName: \"kubernetes.io/projected/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-kube-api-access-pmkvj\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.939719 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/96d699b6-dfba-4b76-b3e8-0480527aa386-cni-binary-copy\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.940154 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f8e0711-7595-4580-b702-558512c33395-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.940179 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovn-node-metrics-cert\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.942678 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2vjb\" (UniqueName: \"kubernetes.io/projected/58464a30-7f56-4e13-894e-e53498a85637-kube-api-access-h2vjb\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.943138 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq7qg\" (UniqueName: \"kubernetes.io/projected/2f8e0711-7595-4580-b702-558512c33395-kube-api-access-fq7qg\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.943804 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h24bv\" (UniqueName: \"kubernetes.io/projected/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-kube-api-access-h24bv\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.945274 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7thmq\" (UniqueName: \"kubernetes.io/projected/05405fad-1758-412e-b3ab-9714a604b207-kube-api-access-7thmq\") pod \"node-ca-9tpww\" (UID: \"05405fad-1758-412e-b3ab-9714a604b207\") " pod="openshift-image-registry/node-ca-9tpww" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.947242 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6x7z\" (UniqueName: \"kubernetes.io/projected/58c65c62-097b-4179-9ada-1627afa9fef2-kube-api-access-w6x7z\") pod \"node-resolver-x4d2p\" (UID: 
\"58c65c62-097b-4179-9ada-1627afa9fef2\") " pod="openshift-dns/node-resolver-x4d2p" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.971116 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.972582 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.972647 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.972661 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.972687 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.972705 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:13Z","lastTransitionTime":"2026-03-13T20:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.980171 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.989751 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 20:29:13 crc kubenswrapper[4790]: W0313 20:29:13.997445 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-1b716c9f8ad89e06f510395191784b35a518cc42d7ef28f37488b4a947a23a58 WatchSource:0}: Error finding container 1b716c9f8ad89e06f510395191784b35a518cc42d7ef28f37488b4a947a23a58: Status 404 returned error can't find the container with id 1b716c9f8ad89e06f510395191784b35a518cc42d7ef28f37488b4a947a23a58 Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.999506 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-x4d2p" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.006293 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:14 crc kubenswrapper[4790]: W0313 20:29:14.009080 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-884e0948e2f90d5908057cf0eca253fb535db59e2880cef0470e5c091aa6e93c WatchSource:0}: Error finding container 884e0948e2f90d5908057cf0eca253fb535db59e2880cef0470e5c091aa6e93c: Status 404 returned error can't find the container with id 884e0948e2f90d5908057cf0eca253fb535db59e2880cef0470e5c091aa6e93c Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.014564 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.023773 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:14 crc kubenswrapper[4790]: W0313 20:29:14.031346 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58c65c62_097b_4179_9ada_1627afa9fef2.slice/crio-05fa2bda765ac8792bd63f0e6656247d0b0d37fe60b0b40a384a5685853fc4ed WatchSource:0}: Error finding container 05fa2bda765ac8792bd63f0e6656247d0b0d37fe60b0b40a384a5685853fc4ed: Status 404 returned error can't find the container with id 05fa2bda765ac8792bd63f0e6656247d0b0d37fe60b0b40a384a5685853fc4ed Mar 13 20:29:14 crc kubenswrapper[4790]: W0313 20:29:14.033124 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0c9dff4_5508_4391_bb03_6710c2b9f3b5.slice/crio-7c759d9eac24045ee77e532dda62f3a6c5e2ed387c3e9d1e970d8448a87220c0 WatchSource:0}: Error finding container 7c759d9eac24045ee77e532dda62f3a6c5e2ed387c3e9d1e970d8448a87220c0: Status 404 returned error can't find the container with id 7c759d9eac24045ee77e532dda62f3a6c5e2ed387c3e9d1e970d8448a87220c0 Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.040762 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.048913 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-x2tjg" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.050342 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"7c759d9eac24045ee77e532dda62f3a6c5e2ed387c3e9d1e970d8448a87220c0"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.054450 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-x4d2p" event={"ID":"58c65c62-097b-4179-9ada-1627afa9fef2","Type":"ContainerStarted","Data":"05fa2bda765ac8792bd63f0e6656247d0b0d37fe60b0b40a384a5685853fc4ed"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.056187 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"1b716c9f8ad89e06f510395191784b35a518cc42d7ef28f37488b4a947a23a58"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.057438 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-9tpww" Mar 13 20:29:14 crc kubenswrapper[4790]: W0313 20:29:14.058357 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96d699b6_dfba_4b76_b3e8_0480527aa386.slice/crio-773eee4aecd56d69b6cc681691e6af32bd27dd70ec94afc0332c51d79c84353b WatchSource:0}: Error finding container 773eee4aecd56d69b6cc681691e6af32bd27dd70ec94afc0332c51d79c84353b: Status 404 returned error can't find the container with id 773eee4aecd56d69b6cc681691e6af32bd27dd70ec94afc0332c51d79c84353b Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.058558 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6a01ef179717ba531d8bd41a0999e48b75a6c0277adf22c04cc1a853f2ae431b"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.061951 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"884e0948e2f90d5908057cf0eca253fb535db59e2880cef0470e5c091aa6e93c"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.076839 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.076872 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.076884 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.076902 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:14 crc 
kubenswrapper[4790]: I0313 20:29:14.076916 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:14Z","lastTransitionTime":"2026-03-13T20:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:14 crc kubenswrapper[4790]: W0313 20:29:14.084316 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58464a30_7f56_4e13_894e_e53498a85637.slice/crio-d18401980d1289de2b5ff091072e4b0d4e6052cfd8baf7e6e1284f1774626939 WatchSource:0}: Error finding container d18401980d1289de2b5ff091072e4b0d4e6052cfd8baf7e6e1284f1774626939: Status 404 returned error can't find the container with id d18401980d1289de2b5ff091072e4b0d4e6052cfd8baf7e6e1284f1774626939 Mar 13 20:29:14 crc kubenswrapper[4790]: W0313 20:29:14.110003 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05405fad_1758_412e_b3ab_9714a604b207.slice/crio-cd61050d42c84009eea1306cad91215cfdb6c00ea85a4a51693551618085766b WatchSource:0}: Error finding container cd61050d42c84009eea1306cad91215cfdb6c00ea85a4a51693551618085766b: Status 404 returned error can't find the container with id cd61050d42c84009eea1306cad91215cfdb6c00ea85a4a51693551618085766b Mar 13 20:29:14 crc kubenswrapper[4790]: W0313 20:29:14.141793 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod207e7f49_094a_4e59_a8ff_9eacd8d6fe2a.slice/crio-c49eedb2d76eaa4770def86510fe121a30b45fee5cb2a90f64616c5d795ddc15 WatchSource:0}: Error finding container c49eedb2d76eaa4770def86510fe121a30b45fee5cb2a90f64616c5d795ddc15: Status 404 returned 
error can't find the container with id c49eedb2d76eaa4770def86510fe121a30b45fee5cb2a90f64616c5d795ddc15 Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.181221 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.181269 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.181285 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.181306 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.181324 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:14Z","lastTransitionTime":"2026-03-13T20:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.289241 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.289293 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.289306 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.289325 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.289336 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:14Z","lastTransitionTime":"2026-03-13T20:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.334581 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.334723 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.334764 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:29:15.334740955 +0000 UTC m=+86.355856846 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.334812 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.334871 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:15.334855628 +0000 UTC m=+86.355971609 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.334797 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.334891 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.334923 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:15.33491614 +0000 UTC m=+86.356032031 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.334940 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.334981 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.335062 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.335073 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.335082 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.335106 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:15.335099915 +0000 UTC m=+86.356215806 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.335149 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.335159 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.335166 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.335184 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" 
failed. No retries permitted until 2026-03-13 20:29:15.335178107 +0000 UTC m=+86.356293998 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.391936 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.391982 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.391995 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.392017 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.392032 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:14Z","lastTransitionTime":"2026-03-13T20:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.436160 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.436342 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.436457 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs podName:c54336a0-5a12-4bf9-9807-337dd352fdb6 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:15.436427913 +0000 UTC m=+86.457543814 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs") pod "network-metrics-daemon-mnf26" (UID: "c54336a0-5a12-4bf9-9807-337dd352fdb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.495304 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.495346 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.495357 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.495397 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.495412 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:14Z","lastTransitionTime":"2026-03-13T20:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.597877 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.597916 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.597926 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.597942 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.597952 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:14Z","lastTransitionTime":"2026-03-13T20:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.700134 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.700482 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.700496 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.700511 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.700521 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:14Z","lastTransitionTime":"2026-03-13T20:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.803611 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.803662 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.803678 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.803701 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.803718 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:14Z","lastTransitionTime":"2026-03-13T20:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.906689 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.906729 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.906739 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.906753 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.906762 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:14Z","lastTransitionTime":"2026-03-13T20:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.008678 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.008719 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.008727 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.008742 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.008758 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:15Z","lastTransitionTime":"2026-03-13T20:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.068166 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-x4d2p" event={"ID":"58c65c62-097b-4179-9ada-1627afa9fef2","Type":"ContainerStarted","Data":"e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.073426 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.076156 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9tpww" event={"ID":"05405fad-1758-412e-b3ab-9714a604b207","Type":"ContainerStarted","Data":"2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.076179 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9tpww" event={"ID":"05405fad-1758-412e-b3ab-9714a604b207","Type":"ContainerStarted","Data":"cd61050d42c84009eea1306cad91215cfdb6c00ea85a4a51693551618085766b"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.078432 4790 generic.go:334] "Generic (PLEG): container finished" podID="96d699b6-dfba-4b76-b3e8-0480527aa386" containerID="cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09" exitCode=0 Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.078465 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" event={"ID":"96d699b6-dfba-4b76-b3e8-0480527aa386","Type":"ContainerDied","Data":"cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.078495 4790 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" event={"ID":"96d699b6-dfba-4b76-b3e8-0480527aa386","Type":"ContainerStarted","Data":"773eee4aecd56d69b6cc681691e6af32bd27dd70ec94afc0332c51d79c84353b"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.081406 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.082915 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768" exitCode=0 Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.082995 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.091588 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" event={"ID":"2f8e0711-7595-4580-b702-558512c33395","Type":"ContainerStarted","Data":"c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.091647 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" event={"ID":"2f8e0711-7595-4580-b702-558512c33395","Type":"ContainerStarted","Data":"cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.091659 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" event={"ID":"2f8e0711-7595-4580-b702-558512c33395","Type":"ContainerStarted","Data":"b2b5c47a36a821bffe16ccd4f1169622238f672dacf2f9863c33e35119b7c278"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.093283 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.093349 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.095873 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x2tjg" event={"ID":"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a","Type":"ContainerStarted","Data":"fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.095918 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x2tjg" event={"ID":"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a","Type":"ContainerStarted","Data":"c49eedb2d76eaa4770def86510fe121a30b45fee5cb2a90f64616c5d795ddc15"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.098343 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.098403 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.098418 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"d18401980d1289de2b5ff091072e4b0d4e6052cfd8baf7e6e1284f1774626939"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.100566 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.111652 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.111678 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.111705 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.111719 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.111730 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:15Z","lastTransitionTime":"2026-03-13T20:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.115013 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.127037 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.138691 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.151281 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.164250 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.181219 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.197590 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.213921 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.213950 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.213962 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.213977 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.213987 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:15Z","lastTransitionTime":"2026-03-13T20:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.219715 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.241498 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.254041 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac3
9aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.267724 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.282730 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc 
kubenswrapper[4790]: I0313 20:29:15.296638 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.317173 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.317210 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.317220 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.317237 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.317248 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:15Z","lastTransitionTime":"2026-03-13T20:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.319814 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.335130 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.345123 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.345227 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.345250 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.345276 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345304 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:29:17.345277879 +0000 UTC m=+88.366393770 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.345343 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345359 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345398 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345414 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345424 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:15 crc 
kubenswrapper[4790]: E0313 20:29:15.345433 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:17.345422083 +0000 UTC m=+88.366538014 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345462 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:17.345450524 +0000 UTC m=+88.366566405 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345496 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345507 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345508 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345521 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345528 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:17.345522586 +0000 UTC m=+88.366638477 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345544 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:17.345538077 +0000 UTC m=+88.366653968 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.347917 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.359958 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.381014 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.401759 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443
a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.417647 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.419895 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.419926 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.419937 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.419952 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.419963 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:15Z","lastTransitionTime":"2026-03-13T20:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.430046 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.446874 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.447082 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.447169 4790 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs podName:c54336a0-5a12-4bf9-9807-337dd352fdb6 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:17.447146882 +0000 UTC m=+88.468262833 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs") pod "network-metrics-daemon-mnf26" (UID: "c54336a0-5a12-4bf9-9807-337dd352fdb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.448407 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.460814 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.472199 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.483769 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.495447 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc 
kubenswrapper[4790]: I0313 20:29:15.522732 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.522762 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.522771 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.522786 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.522795 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:15Z","lastTransitionTime":"2026-03-13T20:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.625254 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.625313 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.625330 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.625353 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.625369 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:15Z","lastTransitionTime":"2026-03-13T20:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.659528 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.659713 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.659534 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.659969 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.659705 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.660100 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.660184 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.660214 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.669932 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.670794 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.671615 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.672237 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.685919 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.686823 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.688243 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.689862 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.691540 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.692253 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.692967 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.694956 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.695889 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.698243 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.699859 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.702940 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.704062 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.706184 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.707136 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.708077 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.711104 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.711963 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.713048 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.714614 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.715307 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.716650 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.718461 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.718933 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.719661 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.720920 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.721420 4790 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.721536 4790 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.725111 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.725685 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.726198 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.729262 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.729412 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.729496 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.729592 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.729681 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:15Z","lastTransitionTime":"2026-03-13T20:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.729625 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.730727 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.731284 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.732436 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.733113 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.733989 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.734573 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.735554 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.736507 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.736946 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.737824 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.738397 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.739470 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.740050 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.740605 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.741547 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.742065 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.743026 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.743486 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.743926 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.832415 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.832659 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.832667 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.832679 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.832690 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:15Z","lastTransitionTime":"2026-03-13T20:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.937405 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.937448 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.937458 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.937475 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.937485 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:15Z","lastTransitionTime":"2026-03-13T20:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.039609 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.039637 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.039645 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.039658 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.039667 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:16Z","lastTransitionTime":"2026-03-13T20:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.103334 4790 generic.go:334] "Generic (PLEG): container finished" podID="96d699b6-dfba-4b76-b3e8-0480527aa386" containerID="38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696" exitCode=0 Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.103429 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" event={"ID":"96d699b6-dfba-4b76-b3e8-0480527aa386","Type":"ContainerDied","Data":"38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.107711 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.107753 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.107767 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.107779 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.119467 4790 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.130778 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.139196 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.142454 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.142505 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.142517 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.142544 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.142557 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:16Z","lastTransitionTime":"2026-03-13T20:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.149416 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc 
kubenswrapper[4790]: I0313 20:29:16.163710 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.176751 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.193027 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.203653 4790 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.214760 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.228714 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443
a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.245342 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.245324 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.245433 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.245446 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.245460 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.245468 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:16Z","lastTransitionTime":"2026-03-13T20:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.265694 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.279250 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.297241 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.311512 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.347692 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:16 crc 
kubenswrapper[4790]: I0313 20:29:16.347733 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.347745 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.347761 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.347771 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:16Z","lastTransitionTime":"2026-03-13T20:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.449977 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.450028 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.450046 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.450065 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.450077 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:16Z","lastTransitionTime":"2026-03-13T20:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.552171 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.552236 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.552253 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.552277 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.552294 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:16Z","lastTransitionTime":"2026-03-13T20:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.655644 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.655698 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.655709 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.655726 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.655736 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:16Z","lastTransitionTime":"2026-03-13T20:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.758073 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.758111 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.758119 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.758132 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.758141 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:16Z","lastTransitionTime":"2026-03-13T20:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.860934 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.860975 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.860985 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.861000 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.861011 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:16Z","lastTransitionTime":"2026-03-13T20:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.964231 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.964269 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.964279 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.964295 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.964307 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:16Z","lastTransitionTime":"2026-03-13T20:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.072288 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.072346 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.072357 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.072373 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.072404 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:17Z","lastTransitionTime":"2026-03-13T20:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.113411 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.113475 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.115287 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.118644 4790 generic.go:334] "Generic (PLEG): container finished" podID="96d699b6-dfba-4b76-b3e8-0480527aa386" containerID="5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375" exitCode=0 Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.118683 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" event={"ID":"96d699b6-dfba-4b76-b3e8-0480527aa386","Type":"ContainerDied","Data":"5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.132565 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.144059 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.159284 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.175573 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.175617 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.175629 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.175645 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.175659 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:17Z","lastTransitionTime":"2026-03-13T20:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.175595 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.185076 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 
2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.194439 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"init
ContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.204414 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.217346 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.227796 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.237453 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.249588 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.260135 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5
c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.270245 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.281265 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.281308 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.281322 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.281340 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.281352 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:17Z","lastTransitionTime":"2026-03-13T20:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.281622 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.294030 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.304483 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.314172 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.323645 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.332896 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc 
kubenswrapper[4790]: I0313 20:29:17.345991 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.360599 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.369920 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:17 crc 
kubenswrapper[4790]: I0313 20:29:17.370016 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.370058 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.370085 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.370119 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370225 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:17 crc kubenswrapper[4790]: 
E0313 20:29:17.370249 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370269 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370314 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:21.370299011 +0000 UTC m=+92.391414912 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370412 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:29:21.370368403 +0000 UTC m=+92.391484294 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370459 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370469 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370477 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370498 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:21.370491957 +0000 UTC m=+92.391607838 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370527 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370546 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:21.370539888 +0000 UTC m=+92.391655779 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370582 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370602 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:21.37059661 +0000 UTC m=+92.391712501 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.374812 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.383592 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.383628 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.383638 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.383654 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.383664 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:17Z","lastTransitionTime":"2026-03-13T20:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.389858 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.401230 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.412025 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.423105 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.435130 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.447484 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.464571 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.470583 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.470881 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.470932 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs podName:c54336a0-5a12-4bf9-9807-337dd352fdb6 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:21.470918248 +0000 UTC m=+92.492034139 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs") pod "network-metrics-daemon-mnf26" (UID: "c54336a0-5a12-4bf9-9807-337dd352fdb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.479246 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 
20:29:17.485948 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.485991 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.486000 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.486015 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.486024 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:17Z","lastTransitionTime":"2026-03-13T20:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.588755 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.588999 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.589009 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.589024 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.589037 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:17Z","lastTransitionTime":"2026-03-13T20:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.659592 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.659646 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.659593 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.659596 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.659754 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.659986 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.660175 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.660218 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.692395 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.692452 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.692467 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.692485 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.692498 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:17Z","lastTransitionTime":"2026-03-13T20:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.795003 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.795050 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.795062 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.795080 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.795094 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:17Z","lastTransitionTime":"2026-03-13T20:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.898090 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.898125 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.898135 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.898150 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.898161 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:17Z","lastTransitionTime":"2026-03-13T20:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.000415 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.000452 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.000465 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.000481 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.000492 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:18Z","lastTransitionTime":"2026-03-13T20:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.108482 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.108520 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.108532 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.108548 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.108559 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:18Z","lastTransitionTime":"2026-03-13T20:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.124839 4790 generic.go:334] "Generic (PLEG): container finished" podID="96d699b6-dfba-4b76-b3e8-0480527aa386" containerID="5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460" exitCode=0 Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.124949 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" event={"ID":"96d699b6-dfba-4b76-b3e8-0480527aa386","Type":"ContainerDied","Data":"5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460"} Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.147055 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.159965 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.180481 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.196887 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b
9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.206807 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.210300 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.210337 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.210348 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.210364 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.210392 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:18Z","lastTransitionTime":"2026-03-13T20:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.219236 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.228999 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.238403 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc 
kubenswrapper[4790]: I0313 20:29:18.250998 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.264819 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.283003 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.294273 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.312085 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.313223 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.313264 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.313276 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.313293 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.313304 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:18Z","lastTransitionTime":"2026-03-13T20:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.331281 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.344878 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.416646 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.416778 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.416851 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.416931 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.417002 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:18Z","lastTransitionTime":"2026-03-13T20:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.521186 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.521224 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.521233 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.521246 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.521255 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:18Z","lastTransitionTime":"2026-03-13T20:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.624044 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.624076 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.624084 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.624098 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.624107 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:18Z","lastTransitionTime":"2026-03-13T20:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.726497 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.726582 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.726599 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.726621 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.726635 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:18Z","lastTransitionTime":"2026-03-13T20:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.829409 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.829468 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.829484 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.829510 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.829535 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:18Z","lastTransitionTime":"2026-03-13T20:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.935083 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.935124 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.935135 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.935150 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.935161 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:18Z","lastTransitionTime":"2026-03-13T20:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.037188 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.037578 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.037701 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.037835 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.037949 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:19Z","lastTransitionTime":"2026-03-13T20:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.136057 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.144132 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.144193 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.144216 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.144244 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.144266 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:19Z","lastTransitionTime":"2026-03-13T20:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.147708 4790 generic.go:334] "Generic (PLEG): container finished" podID="96d699b6-dfba-4b76-b3e8-0480527aa386" containerID="28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283" exitCode=0 Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.147749 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" event={"ID":"96d699b6-dfba-4b76-b3e8-0480527aa386","Type":"ContainerDied","Data":"28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.167405 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaea
d203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"o
s-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\
\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.184561 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.202623 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.226814 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.238557 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc 
kubenswrapper[4790]: I0313 20:29:19.248230 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.248271 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.248289 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.248308 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.248317 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:19Z","lastTransitionTime":"2026-03-13T20:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.249759 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.274867 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.291552 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.313222 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.329397 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.342455 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.350589 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:19 crc 
kubenswrapper[4790]: I0313 20:29:19.350636 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.350652 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.350671 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.350681 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:19Z","lastTransitionTime":"2026-03-13T20:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.355010 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443
a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.364687 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.378050 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.389168 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.453818 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.453854 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.453866 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.453882 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.453893 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:19Z","lastTransitionTime":"2026-03-13T20:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.557488 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.557557 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.557578 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.557613 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.557629 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:19Z","lastTransitionTime":"2026-03-13T20:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.658834 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.658867 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.658879 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.658952 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:19 crc kubenswrapper[4790]: E0313 20:29:19.659072 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:19 crc kubenswrapper[4790]: E0313 20:29:19.659471 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:19 crc kubenswrapper[4790]: E0313 20:29:19.659542 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:19 crc kubenswrapper[4790]: E0313 20:29:19.659617 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.660211 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.660257 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.660268 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.660281 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.660293 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:19Z","lastTransitionTime":"2026-03-13T20:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.675410 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.687239 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.698769 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.711606 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.725439 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.741482 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.763408 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.763439 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.763448 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.763462 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.763472 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:19Z","lastTransitionTime":"2026-03-13T20:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.764578 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.781134 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b
9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.794681 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.805635 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.816630 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.828613 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc 
kubenswrapper[4790]: I0313 20:29:19.843113 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.858012 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.865478 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:19 crc 
kubenswrapper[4790]: I0313 20:29:19.865602 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.865697 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.865780 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.865865 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:19Z","lastTransitionTime":"2026-03-13T20:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.871469 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.968347 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.968405 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.968421 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:19 crc 
kubenswrapper[4790]: I0313 20:29:19.968436 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.968446 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:19Z","lastTransitionTime":"2026-03-13T20:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.070353 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.070413 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.070423 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.070437 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.070446 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.155274 4790 generic.go:334] "Generic (PLEG): container finished" podID="96d699b6-dfba-4b76-b3e8-0480527aa386" containerID="e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b" exitCode=0 Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.155315 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" event={"ID":"96d699b6-dfba-4b76-b3e8-0480527aa386","Type":"ContainerDied","Data":"e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.173204 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.173234 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.173244 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.173256 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.173264 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.175198 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.200403 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.201048 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.201096 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.201107 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.201121 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.201131 4790 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.214225 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name
\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: E0313 20:29:20.217971 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.223747 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.223772 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.223780 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.223793 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.223812 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.226530 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: E0313 20:29:20.237301 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.239151 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.244102 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.244130 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.244142 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.244158 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.244168 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.250394 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc 
kubenswrapper[4790]: E0313 20:29:20.257931 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.261558 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.266594 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 
20:29:20.266649 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.266661 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.266675 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.266685 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.271801 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: E0313 20:29:20.278523 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.281650 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.281680 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.281690 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.281704 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.281714 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.286500 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: E0313 20:29:20.295340 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: E0313 20:29:20.295529 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.296798 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.296834 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.296844 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.296864 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.296877 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.299528 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.314548 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.328422 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.341546 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\"
:\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.354865 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.368945 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.400674 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.400730 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.400746 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.400767 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.400782 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.503107 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.503141 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.503152 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.503169 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.503179 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.605004 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.605038 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.605049 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.605062 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.605072 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.707442 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.707864 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.707889 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.707912 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.707931 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.810003 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.810042 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.810051 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.810065 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.810074 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.912332 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.912396 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.912409 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.912425 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.912438 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.014296 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.014339 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.014348 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.014362 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.014418 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:21Z","lastTransitionTime":"2026-03-13T20:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.116214 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.116257 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.116268 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.116282 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.116294 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:21Z","lastTransitionTime":"2026-03-13T20:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.161532 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" event={"ID":"96d699b6-dfba-4b76-b3e8-0480527aa386","Type":"ContainerStarted","Data":"312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.166553 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"d0887928b430f1d66be99cbe2ad22893fc680bd99931351299b685220f447840"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.167365 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.167451 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.167540 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.175199 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc 
kubenswrapper[4790]: I0313 20:29:21.190466 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.201350 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.206109 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.206167 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.210823 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e
3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.218451 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.218471 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.218478 4790 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.218491 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.218500 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:21Z","lastTransitionTime":"2026-03-13T20:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.224505 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.234397 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.247342 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.262939 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5
c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.273637 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.287159 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.298054 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.310437 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.320501 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.321184 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.321232 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.321248 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.321281 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.321297 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:21Z","lastTransitionTime":"2026-03-13T20:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.337128 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f2
8657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.352641 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.364828 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.377044 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.392343 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.406093 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.410960 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.411115 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411135 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:29:29.41110746 +0000 UTC m=+100.432223361 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.411180 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.411243 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411283 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411308 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411342 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.411283 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411408 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411416 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411431 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411455 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411469 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411473 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:29.41145172 +0000 UTC m=+100.432567651 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411502 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:29.411490201 +0000 UTC m=+100.432606132 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411524 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:29.411513552 +0000 UTC m=+100.432629493 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411557 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:29.411545672 +0000 UTC m=+100.432661603 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.421767 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.423256 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.423282 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:21 crc 
kubenswrapper[4790]: I0313 20:29:21.423291 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.423303 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.423311 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:21Z","lastTransitionTime":"2026-03-13T20:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.440165 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.452736 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.468316 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.482874 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.505301 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0887928b430f1d66be99cbe2ad22893fc680bd99931351299b685220f447840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.512493 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.512617 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.512695 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs podName:c54336a0-5a12-4bf9-9807-337dd352fdb6 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:29.512677735 +0000 UTC m=+100.533793636 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs") pod "network-metrics-daemon-mnf26" (UID: "c54336a0-5a12-4bf9-9807-337dd352fdb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.525170 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714
c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net
.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.526908 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.527026 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.527090 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.527153 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.527209 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:21Z","lastTransitionTime":"2026-03-13T20:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.539412 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.552247 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.563335 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.574943 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc 
kubenswrapper[4790]: I0313 20:29:21.628941 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.629200 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.629274 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.629369 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.629473 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:21Z","lastTransitionTime":"2026-03-13T20:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.659705 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.659829 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.659897 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.659954 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.659972 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.660040 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.660184 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.660279 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.731918 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.731953 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.731961 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.731973 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.731982 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:21Z","lastTransitionTime":"2026-03-13T20:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.834599 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.834645 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.834656 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.834673 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.834684 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:21Z","lastTransitionTime":"2026-03-13T20:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.937801 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.937863 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.937883 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.937905 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.937922 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:21Z","lastTransitionTime":"2026-03-13T20:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.040617 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.040686 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.040711 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.040740 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.040762 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:22Z","lastTransitionTime":"2026-03-13T20:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.143343 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.143399 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.143408 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.143422 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.143432 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:22Z","lastTransitionTime":"2026-03-13T20:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.245509 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.245565 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.245577 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.245594 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.245605 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:22Z","lastTransitionTime":"2026-03-13T20:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.348744 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.348803 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.348817 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.348832 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.348844 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:22Z","lastTransitionTime":"2026-03-13T20:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.451686 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.451733 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.451748 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.451765 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.451780 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:22Z","lastTransitionTime":"2026-03-13T20:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.554196 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.554241 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.554253 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.554268 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.554278 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:22Z","lastTransitionTime":"2026-03-13T20:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.656121 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.656156 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.656165 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.656177 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.656186 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:22Z","lastTransitionTime":"2026-03-13T20:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.758198 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.758232 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.758239 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.758251 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.758261 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:22Z","lastTransitionTime":"2026-03-13T20:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.860325 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.860355 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.860363 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.860398 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.860414 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:22Z","lastTransitionTime":"2026-03-13T20:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.962782 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.962832 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.962843 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.962887 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.962900 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:22Z","lastTransitionTime":"2026-03-13T20:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.065067 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.065115 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.065127 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.065149 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.065162 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:23Z","lastTransitionTime":"2026-03-13T20:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.167270 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.167335 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.167344 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.167357 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.167366 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:23Z","lastTransitionTime":"2026-03-13T20:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.270119 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.270159 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.270168 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.270183 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.270192 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:23Z","lastTransitionTime":"2026-03-13T20:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.372175 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.372254 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.372272 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.372295 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.372335 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:23Z","lastTransitionTime":"2026-03-13T20:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.474877 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.474926 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.474961 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.474982 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.474995 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:23Z","lastTransitionTime":"2026-03-13T20:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.577763 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.577798 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.577807 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.577821 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.577830 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:23Z","lastTransitionTime":"2026-03-13T20:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.659435 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.659515 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.659528 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.660092 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:23 crc kubenswrapper[4790]: E0313 20:29:23.660072 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:23 crc kubenswrapper[4790]: E0313 20:29:23.660257 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:23 crc kubenswrapper[4790]: E0313 20:29:23.660450 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:23 crc kubenswrapper[4790]: E0313 20:29:23.660707 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.680175 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.680208 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.680215 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.680229 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.680237 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:23Z","lastTransitionTime":"2026-03-13T20:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.782769 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.782830 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.782863 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.782892 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.782912 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:23Z","lastTransitionTime":"2026-03-13T20:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.885858 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.885907 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.885924 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.885945 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.885961 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:23Z","lastTransitionTime":"2026-03-13T20:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.988428 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.988460 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.988471 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.988486 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.988496 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:23Z","lastTransitionTime":"2026-03-13T20:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.091445 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.091477 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.091486 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.091498 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.091509 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:24Z","lastTransitionTime":"2026-03-13T20:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.177008 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/0.log" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.180518 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="d0887928b430f1d66be99cbe2ad22893fc680bd99931351299b685220f447840" exitCode=1 Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.180584 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"d0887928b430f1d66be99cbe2ad22893fc680bd99931351299b685220f447840"} Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.181824 4790 scope.go:117] "RemoveContainer" containerID="d0887928b430f1d66be99cbe2ad22893fc680bd99931351299b685220f447840" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.193408 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.193457 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.193467 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.193483 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.193493 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:24Z","lastTransitionTime":"2026-03-13T20:29:24Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.195755 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},
{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.217355 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.233449 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.247060 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc 
kubenswrapper[4790]: I0313 20:29:24.261284 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.276089 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.291368 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.297888 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:24 crc 
kubenswrapper[4790]: I0313 20:29:24.297943 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.297956 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.297979 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.297993 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:24Z","lastTransitionTime":"2026-03-13T20:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.305498 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.319826 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.334142 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.345596 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.359269 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.370877 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.386345 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0887928b430f1d66be99cbe2ad22893fc680bd99931351299b685220f447840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0887928b430f1d66be99cbe2ad22893fc680bd99931351299b685220f447840\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:23Z\\\",\\\"message\\\":\\\"0:29:23.322508 6693 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 20:29:23.322517 6693 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 20:29:23.322524 6693 handler.go:208] Removed *v1.Pod event 
handler 3\\\\nI0313 20:29:23.322595 6693 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 20:29:23.322648 6693 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 20:29:23.322660 6693 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 20:29:23.322691 6693 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 20:29:23.322737 6693 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 20:29:23.322747 6693 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 20:29:23.322735 6693 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 20:29:23.322765 6693 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 20:29:23.322770 6693 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 20:29:23.322751 6693 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 20:29:23.322797 6693 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 20:29:23.323000 6693 factory.go:656] Stopping watch factory\\\\nI0313 20:29:23.323035 6693 ovnkube.go:599] Stopped 
ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0
f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.398817 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480e
bb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.400124 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.400156 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.400165 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.400179 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.400188 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:24Z","lastTransitionTime":"2026-03-13T20:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.502221 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.502254 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.502264 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.502278 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.502287 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:24Z","lastTransitionTime":"2026-03-13T20:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.604489 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.604530 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.604541 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.604557 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.604569 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:24Z","lastTransitionTime":"2026-03-13T20:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.707875 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.707929 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.707942 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.707960 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.707972 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:24Z","lastTransitionTime":"2026-03-13T20:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.809668 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.809707 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.809716 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.809729 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.809740 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:24Z","lastTransitionTime":"2026-03-13T20:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.913597 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.913645 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.913655 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.913672 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.913682 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:24Z","lastTransitionTime":"2026-03-13T20:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.016643 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.016692 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.016704 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.016726 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.016738 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:25Z","lastTransitionTime":"2026-03-13T20:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.119632 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.119673 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.119682 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.119697 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.119707 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:25Z","lastTransitionTime":"2026-03-13T20:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.185678 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/1.log" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.186579 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/0.log" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.190285 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367" exitCode=1 Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.190325 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367"} Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.190362 4790 scope.go:117] "RemoveContainer" containerID="d0887928b430f1d66be99cbe2ad22893fc680bd99931351299b685220f447840" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.191513 4790 scope.go:117] "RemoveContainer" containerID="e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367" Mar 13 20:29:25 crc kubenswrapper[4790]: E0313 20:29:25.191805 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.205851 4790 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc 
kubenswrapper[4790]: I0313 20:29:25.214498 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.222562 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.222602 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.222611 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 
20:29:25.222626 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.222635 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:25Z","lastTransitionTime":"2026-03-13T20:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.224533 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.234683 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e
3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.249363 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.263556 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.276810 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.291803 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5
c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.303981 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.315976 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.326720 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.326781 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.326797 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.326820 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.326838 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:25Z","lastTransitionTime":"2026-03-13T20:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.354460 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.371493 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-
13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.385888 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.398884 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.418112 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0887928b430f1d66be99cbe2ad22893fc680bd99931351299b685220f447840\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:23Z\\\",\\\"message\\\":\\\"0:29:23.322508 6693 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 20:29:23.322517 6693 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 20:29:23.322524 6693 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 20:29:23.322595 6693 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 
20:29:23.322648 6693 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 20:29:23.322660 6693 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 20:29:23.322691 6693 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 20:29:23.322737 6693 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 20:29:23.322747 6693 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 20:29:23.322735 6693 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 20:29:23.322765 6693 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 20:29:23.322770 6693 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 20:29:23.322751 6693 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 20:29:23.322797 6693 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 20:29:23.323000 6693 factory.go:656] Stopping watch factory\\\\nI0313 20:29:23.323035 6693 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:24Z\\\",\\\"message\\\":\\\"mn _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update 
Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:24.978253 6832 ovnkube.go:599] Stopped ovnkube\\\\nI0313 20:29:24.978290 6832 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 20:29:24.978371 6832 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.441219 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.441267 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.441281 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.441300 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.441312 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:25Z","lastTransitionTime":"2026-03-13T20:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.545143 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.545195 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.545205 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.545226 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.545236 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:25Z","lastTransitionTime":"2026-03-13T20:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.648464 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.648520 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.648533 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.648550 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.648561 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:25Z","lastTransitionTime":"2026-03-13T20:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.658940 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.658959 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.658955 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.658960 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:25 crc kubenswrapper[4790]: E0313 20:29:25.659081 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:25 crc kubenswrapper[4790]: E0313 20:29:25.659224 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:25 crc kubenswrapper[4790]: E0313 20:29:25.659521 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:25 crc kubenswrapper[4790]: E0313 20:29:25.659608 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.751475 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.751516 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.751528 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.751544 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.751556 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:25Z","lastTransitionTime":"2026-03-13T20:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.854861 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.854932 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.854943 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.854965 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.854980 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:25Z","lastTransitionTime":"2026-03-13T20:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.957904 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.957968 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.957986 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.958008 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.958024 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:25Z","lastTransitionTime":"2026-03-13T20:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.061248 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.061297 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.061307 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.061328 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.061340 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:26Z","lastTransitionTime":"2026-03-13T20:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.164552 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.164617 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.164628 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.164649 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.164663 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:26Z","lastTransitionTime":"2026-03-13T20:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.197549 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/1.log" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.266662 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.266704 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.266715 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.266733 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.266743 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:26Z","lastTransitionTime":"2026-03-13T20:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.370170 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.370229 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.370252 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.370280 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.370301 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:26Z","lastTransitionTime":"2026-03-13T20:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.473796 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.473835 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.473848 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.473868 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.473879 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:26Z","lastTransitionTime":"2026-03-13T20:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.576944 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.577018 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.577031 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.577056 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.577075 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:26Z","lastTransitionTime":"2026-03-13T20:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.679958 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.680019 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.680029 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.680047 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.680068 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:26Z","lastTransitionTime":"2026-03-13T20:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.783258 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.783301 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.783318 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.783336 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.783346 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:26Z","lastTransitionTime":"2026-03-13T20:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.886027 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.886062 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.886072 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.886085 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.886094 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:26Z","lastTransitionTime":"2026-03-13T20:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.988453 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.988719 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.988752 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.988786 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.988821 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:26Z","lastTransitionTime":"2026-03-13T20:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.091275 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.091329 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.091343 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.091359 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.091398 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:27Z","lastTransitionTime":"2026-03-13T20:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.193672 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.193719 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.193730 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.193746 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.193759 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:27Z","lastTransitionTime":"2026-03-13T20:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.296226 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.296274 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.296292 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.296313 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.296330 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:27Z","lastTransitionTime":"2026-03-13T20:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.398314 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.398355 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.398367 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.398410 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.398423 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:27Z","lastTransitionTime":"2026-03-13T20:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.500910 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.500973 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.500985 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.501003 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.501014 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:27Z","lastTransitionTime":"2026-03-13T20:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.603627 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.603678 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.603687 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.603708 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.603720 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:27Z","lastTransitionTime":"2026-03-13T20:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.659828 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.659902 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.659851 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.659978 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:27 crc kubenswrapper[4790]: E0313 20:29:27.660188 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:27 crc kubenswrapper[4790]: E0313 20:29:27.660347 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:27 crc kubenswrapper[4790]: E0313 20:29:27.660564 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:27 crc kubenswrapper[4790]: E0313 20:29:27.660690 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.678126 4790 scope.go:117] "RemoveContainer" containerID="39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab" Mar 13 20:29:27 crc kubenswrapper[4790]: E0313 20:29:27.678446 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.679551 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.707048 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.707117 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.707134 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.707161 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.707178 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:27Z","lastTransitionTime":"2026-03-13T20:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.810128 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.810188 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.810199 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.810216 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.810227 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:27Z","lastTransitionTime":"2026-03-13T20:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.912769 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.912835 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.912852 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.912875 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.912895 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:27Z","lastTransitionTime":"2026-03-13T20:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.015706 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.015782 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.015794 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.015816 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.015832 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:28Z","lastTransitionTime":"2026-03-13T20:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.036814 4790 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.120880 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.121054 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.121079 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.121173 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.121282 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:28Z","lastTransitionTime":"2026-03-13T20:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.206998 4790 scope.go:117] "RemoveContainer" containerID="39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab" Mar 13 20:29:28 crc kubenswrapper[4790]: E0313 20:29:28.207307 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.223909 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.223964 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.223978 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.224006 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.224020 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:28Z","lastTransitionTime":"2026-03-13T20:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.326754 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.326813 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.326823 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.326839 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.326851 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:28Z","lastTransitionTime":"2026-03-13T20:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.430419 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.430462 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.430478 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.430494 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.430504 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:28Z","lastTransitionTime":"2026-03-13T20:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.540815 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.540878 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.540887 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.540901 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.540911 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:28Z","lastTransitionTime":"2026-03-13T20:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.645115 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.645147 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.645170 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.645186 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.645198 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:28Z","lastTransitionTime":"2026-03-13T20:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.748681 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.748767 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.748797 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.748825 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.748846 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:28Z","lastTransitionTime":"2026-03-13T20:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.852011 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.852053 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.852062 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.852079 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.852088 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:28Z","lastTransitionTime":"2026-03-13T20:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.954764 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.954808 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.954819 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.954835 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.954848 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:28Z","lastTransitionTime":"2026-03-13T20:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.058328 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.058367 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.058395 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.058413 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.058425 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:29Z","lastTransitionTime":"2026-03-13T20:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.161619 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.161662 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.161671 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.161685 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.161695 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:29Z","lastTransitionTime":"2026-03-13T20:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.263709 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.264057 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.264211 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.264435 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.264619 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:29Z","lastTransitionTime":"2026-03-13T20:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.367272 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.367790 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.368034 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.368232 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.368530 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:29Z","lastTransitionTime":"2026-03-13T20:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.471264 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.471301 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.471314 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.471329 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.471340 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:29Z","lastTransitionTime":"2026-03-13T20:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.496827 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.496981 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 20:29:45.496954199 +0000 UTC m=+116.518070120 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.497107 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.497197 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.497260 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:45.497244757 +0000 UTC m=+116.518360678 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.497198 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.497450 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.497546 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:45.497526375 +0000 UTC m=+116.518642336 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.497474 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.497643 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.497778 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.497872 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.497974 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.498129 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:45.498118712 +0000 UTC m=+116.519234603 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.497898 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.498310 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.498424 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.498561 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-13 20:29:45.498530983 +0000 UTC m=+116.519646974 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.573203 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.573477 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.573557 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.573651 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.573724 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:29Z","lastTransitionTime":"2026-03-13T20:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.598335 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.598528 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.598604 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs podName:c54336a0-5a12-4bf9-9807-337dd352fdb6 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:45.598586154 +0000 UTC m=+116.619702065 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs") pod "network-metrics-daemon-mnf26" (UID: "c54336a0-5a12-4bf9-9807-337dd352fdb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.659371 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.659456 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.659469 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.659491 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.660118 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.659946 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.659781 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.660185 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.676448 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.676678 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.676748 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.676816 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.676879 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:29Z","lastTransitionTime":"2026-03-13T20:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.676949 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.690084 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.701931 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.711889 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5
c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.720936 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.736357 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.752495 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.765137 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.779630 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.779688 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.779700 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.779720 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.779734 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:29Z","lastTransitionTime":"2026-03-13T20:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.783282 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.799482 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.813315 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.837001 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0887928b430f1d66be99cbe2ad22893fc680bd99931351299b685220f447840\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:23Z\\\",\\\"message\\\":\\\"0:29:23.322508 6693 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 20:29:23.322517 6693 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 20:29:23.322524 6693 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 20:29:23.322595 6693 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 
20:29:23.322648 6693 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 20:29:23.322660 6693 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 20:29:23.322691 6693 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 20:29:23.322737 6693 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 20:29:23.322747 6693 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 20:29:23.322735 6693 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 20:29:23.322765 6693 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 20:29:23.322770 6693 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 20:29:23.322751 6693 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 20:29:23.322797 6693 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 20:29:23.323000 6693 factory.go:656] Stopping watch factory\\\\nI0313 20:29:23.323035 6693 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:24Z\\\",\\\"message\\\":\\\"mn _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update 
Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:24.978253 6832 ovnkube.go:599] Stopped ovnkube\\\\nI0313 20:29:24.978290 6832 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 20:29:24.978371 6832 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.850842 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.863101 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.871998 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.881812 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.881839 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.881850 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.881865 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.881876 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:29Z","lastTransitionTime":"2026-03-13T20:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.883728 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.985245 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:29 crc 
kubenswrapper[4790]: I0313 20:29:29.985314 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.985335 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.985363 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.985417 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:29Z","lastTransitionTime":"2026-03-13T20:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.088121 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.088202 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.088213 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.088233 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.088246 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.190282 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.190352 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.190416 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.190467 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.190492 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.292807 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.293204 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.293478 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.293715 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.293924 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.396884 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.397219 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.397363 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.397611 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.397810 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.501372 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.501467 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.501489 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.501513 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.501530 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.605141 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.605555 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.605726 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.605866 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.606006 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.634748 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.635087 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.635267 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.635482 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.635649 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: E0313 20:29:30.654887 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:30Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.660930 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.660997 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.661011 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.661026 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.661037 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: E0313 20:29:30.679707 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:30Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.685107 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.685324 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.685513 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.685707 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.685854 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: E0313 20:29:30.702326 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:30Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.707258 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.707295 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.707309 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.707327 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.707338 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: E0313 20:29:30.722945 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:30Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.727471 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.727547 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.727571 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.727604 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.727633 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: E0313 20:29:30.743316 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:30Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:30 crc kubenswrapper[4790]: E0313 20:29:30.743505 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.745697 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.745726 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.745737 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.745755 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.745767 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.849527 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.849576 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.849587 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.849604 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.849616 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.953186 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.953261 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.953272 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.953317 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.953328 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.057149 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.057215 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.057237 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.057269 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.057289 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:31Z","lastTransitionTime":"2026-03-13T20:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.161600 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.161684 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.161707 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.161736 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.161811 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:31Z","lastTransitionTime":"2026-03-13T20:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.273578 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.273666 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.273678 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.273700 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.273768 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:31Z","lastTransitionTime":"2026-03-13T20:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.376877 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.376966 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.376991 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.377019 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.377037 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:31Z","lastTransitionTime":"2026-03-13T20:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.480498 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.480566 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.480589 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.480618 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.480638 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:31Z","lastTransitionTime":"2026-03-13T20:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.583479 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.583599 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.583624 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.583654 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.583676 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:31Z","lastTransitionTime":"2026-03-13T20:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.660021 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.660106 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:31 crc kubenswrapper[4790]: E0313 20:29:31.660195 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.660231 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.660502 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:31 crc kubenswrapper[4790]: E0313 20:29:31.660633 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:31 crc kubenswrapper[4790]: E0313 20:29:31.660710 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:31 crc kubenswrapper[4790]: E0313 20:29:31.660910 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.687630 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.687676 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.687687 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.687706 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.687719 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:31Z","lastTransitionTime":"2026-03-13T20:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.790336 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.790473 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.790498 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.790528 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.790549 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:31Z","lastTransitionTime":"2026-03-13T20:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.893983 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.894036 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.894047 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.894060 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.894069 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:31Z","lastTransitionTime":"2026-03-13T20:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.997266 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.997321 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.997339 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.997361 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.997405 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:31Z","lastTransitionTime":"2026-03-13T20:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.099987 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.100036 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.100046 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.100061 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.100071 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:32Z","lastTransitionTime":"2026-03-13T20:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.202168 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.202215 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.202226 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.202242 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.202252 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:32Z","lastTransitionTime":"2026-03-13T20:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.304756 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.304798 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.304808 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.304828 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.304841 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:32Z","lastTransitionTime":"2026-03-13T20:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.407825 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.408024 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.408061 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.408095 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.408118 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:32Z","lastTransitionTime":"2026-03-13T20:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.510831 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.510881 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.510892 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.510909 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.510920 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:32Z","lastTransitionTime":"2026-03-13T20:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.613794 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.613854 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.613864 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.613879 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.613888 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:32Z","lastTransitionTime":"2026-03-13T20:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.716851 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.716925 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.716949 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.716978 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.717000 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:32Z","lastTransitionTime":"2026-03-13T20:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.820767 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.820820 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.820831 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.820848 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.820863 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:32Z","lastTransitionTime":"2026-03-13T20:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.923895 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.923942 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.923965 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.923990 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.924004 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:32Z","lastTransitionTime":"2026-03-13T20:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.026941 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.027015 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.027038 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.027076 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.027098 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:33Z","lastTransitionTime":"2026-03-13T20:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.129964 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.130020 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.130031 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.130046 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.130056 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:33Z","lastTransitionTime":"2026-03-13T20:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.232939 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.233006 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.233024 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.233048 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.233067 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:33Z","lastTransitionTime":"2026-03-13T20:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.338661 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.338704 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.338714 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.338731 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.338742 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:33Z","lastTransitionTime":"2026-03-13T20:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.441094 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.441133 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.441144 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.441159 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.441169 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:33Z","lastTransitionTime":"2026-03-13T20:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.544908 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.544949 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.544958 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.544973 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.544989 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:33Z","lastTransitionTime":"2026-03-13T20:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.648238 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.648323 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.648404 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.648444 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.648469 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:33Z","lastTransitionTime":"2026-03-13T20:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.658930 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.658995 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.658938 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:33 crc kubenswrapper[4790]: E0313 20:29:33.659180 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:33 crc kubenswrapper[4790]: E0313 20:29:33.659312 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:33 crc kubenswrapper[4790]: E0313 20:29:33.659474 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.659718 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:33 crc kubenswrapper[4790]: E0313 20:29:33.659983 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.751498 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.751544 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.751556 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.751575 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.751588 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:33Z","lastTransitionTime":"2026-03-13T20:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.854592 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.854622 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.854633 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.854649 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.854660 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:33Z","lastTransitionTime":"2026-03-13T20:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.957436 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.957466 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.957475 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.957489 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.957500 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:33Z","lastTransitionTime":"2026-03-13T20:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.059510 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.059557 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.059566 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.059581 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.059593 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:34Z","lastTransitionTime":"2026-03-13T20:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.162066 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.162122 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.162131 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.162145 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.162154 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:34Z","lastTransitionTime":"2026-03-13T20:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.265096 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.265138 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.265147 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.265164 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.265176 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:34Z","lastTransitionTime":"2026-03-13T20:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.367266 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.367344 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.367390 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.367407 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.367417 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:34Z","lastTransitionTime":"2026-03-13T20:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.470558 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.470604 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.470615 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.470647 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.470659 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:34Z","lastTransitionTime":"2026-03-13T20:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.573704 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.573743 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.573751 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.573765 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.573774 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:34Z","lastTransitionTime":"2026-03-13T20:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.676039 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.676100 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.676117 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.676139 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.676155 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:34Z","lastTransitionTime":"2026-03-13T20:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.778855 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.778910 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.778926 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.778949 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.778966 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:34Z","lastTransitionTime":"2026-03-13T20:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.881209 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.881267 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.881288 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.881314 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.881334 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:34Z","lastTransitionTime":"2026-03-13T20:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.983853 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.983886 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.983896 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.983910 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.983919 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:34Z","lastTransitionTime":"2026-03-13T20:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.086506 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.086565 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.086583 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.086605 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.086625 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:35Z","lastTransitionTime":"2026-03-13T20:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.189607 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.189652 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.189662 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.189678 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.189692 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:35Z","lastTransitionTime":"2026-03-13T20:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.292031 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.292085 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.292098 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.292115 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.292127 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:35Z","lastTransitionTime":"2026-03-13T20:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.394215 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.394254 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.394264 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.394279 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.394291 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:35Z","lastTransitionTime":"2026-03-13T20:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.497427 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.497492 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.497515 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.497544 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.497565 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:35Z","lastTransitionTime":"2026-03-13T20:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.600796 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.600855 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.600877 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.600902 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.600922 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:35Z","lastTransitionTime":"2026-03-13T20:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.660617 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.660755 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.660774 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.660894 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:35 crc kubenswrapper[4790]: E0313 20:29:35.660888 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:35 crc kubenswrapper[4790]: E0313 20:29:35.661052 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:35 crc kubenswrapper[4790]: E0313 20:29:35.661596 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:35 crc kubenswrapper[4790]: E0313 20:29:35.661784 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.677982 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.703593 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.703627 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.703637 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.703665 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.703675 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:35Z","lastTransitionTime":"2026-03-13T20:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.806157 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.806205 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.806216 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.806231 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.806242 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:35Z","lastTransitionTime":"2026-03-13T20:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.910257 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.910359 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.910427 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.910470 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.910512 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:35Z","lastTransitionTime":"2026-03-13T20:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.014136 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.014503 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.014592 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.014665 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.014730 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:36Z","lastTransitionTime":"2026-03-13T20:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.117466 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.117539 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.117558 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.117584 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.117601 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:36Z","lastTransitionTime":"2026-03-13T20:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.220892 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.220956 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.220974 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.221003 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.221023 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:36Z","lastTransitionTime":"2026-03-13T20:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.326040 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.326111 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.326130 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.326154 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.326171 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:36Z","lastTransitionTime":"2026-03-13T20:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.429425 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.429522 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.429547 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.429582 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.429606 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:36Z","lastTransitionTime":"2026-03-13T20:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.533132 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.533189 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.533203 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.533226 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.533242 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:36Z","lastTransitionTime":"2026-03-13T20:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.637060 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.637129 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.637146 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.637169 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.637188 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:36Z","lastTransitionTime":"2026-03-13T20:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.740842 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.740893 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.740905 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.740922 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.740934 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:36Z","lastTransitionTime":"2026-03-13T20:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.843985 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.844068 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.844090 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.844121 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.844141 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:36Z","lastTransitionTime":"2026-03-13T20:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.946605 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.946657 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.946675 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.946696 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.946712 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:36Z","lastTransitionTime":"2026-03-13T20:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.048814 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.048900 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.048921 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.048945 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.048968 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:37Z","lastTransitionTime":"2026-03-13T20:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.152247 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.152611 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.152709 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.152794 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.152867 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:37Z","lastTransitionTime":"2026-03-13T20:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.255884 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.255936 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.255950 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.255969 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.255981 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:37Z","lastTransitionTime":"2026-03-13T20:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.359872 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.359942 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.359955 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.359976 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.359990 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:37Z","lastTransitionTime":"2026-03-13T20:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.463055 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.463286 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.463399 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.463465 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.463528 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:37Z","lastTransitionTime":"2026-03-13T20:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.566160 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.566708 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.566902 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.567221 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.567364 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:37Z","lastTransitionTime":"2026-03-13T20:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.659400 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.659450 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:37 crc kubenswrapper[4790]: E0313 20:29:37.659551 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.659626 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:37 crc kubenswrapper[4790]: E0313 20:29:37.659729 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.659361 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:37 crc kubenswrapper[4790]: E0313 20:29:37.659935 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:37 crc kubenswrapper[4790]: E0313 20:29:37.660039 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.671041 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.671124 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.671150 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.671177 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.671201 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:37Z","lastTransitionTime":"2026-03-13T20:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.775051 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.775427 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.775452 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.775480 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.775713 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:37Z","lastTransitionTime":"2026-03-13T20:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.878516 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.878579 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.878597 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.878626 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.878644 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:37Z","lastTransitionTime":"2026-03-13T20:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.982211 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.982247 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.982257 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.982271 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.982283 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:37Z","lastTransitionTime":"2026-03-13T20:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.084999 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.085091 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.085110 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.085132 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.085150 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:38Z","lastTransitionTime":"2026-03-13T20:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.188072 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.188117 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.188130 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.188147 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.188158 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:38Z","lastTransitionTime":"2026-03-13T20:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.291456 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.291497 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.291506 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.291521 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.291531 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:38Z","lastTransitionTime":"2026-03-13T20:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.393531 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.393568 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.393579 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.393596 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.393606 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:38Z","lastTransitionTime":"2026-03-13T20:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.497025 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.497084 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.497101 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.497124 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.497140 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:38Z","lastTransitionTime":"2026-03-13T20:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.599945 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.599985 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.599993 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.600007 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.600019 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:38Z","lastTransitionTime":"2026-03-13T20:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.660178 4790 scope.go:117] "RemoveContainer" containerID="e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.676978 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.695158 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\
\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.702631 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.702681 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.702698 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.702722 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.702741 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:38Z","lastTransitionTime":"2026-03-13T20:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.715914 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:24Z\\\",\\\"message\\\":\\\"mn _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:24.978253 6832 ovnkube.go:599] Stopped ovnkube\\\\nI0313 20:29:24.978290 6832 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 20:29:24.978371 6832 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537b
d2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.734159 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480e
bb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.759401 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d
98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.773079 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.784978 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.797946 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.804919 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.805000 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.805017 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.805038 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.805051 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:38Z","lastTransitionTime":"2026-03-13T20:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.810594 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc 
kubenswrapper[4790]: I0313 20:29:38.823723 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.839439 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.853990 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.869298 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.882574 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.895828 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.906813 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.908132 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.908168 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.908178 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.908193 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.908202 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:38Z","lastTransitionTime":"2026-03-13T20:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.925115 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.010516 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.010832 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.010842 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.010856 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.010866 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:39Z","lastTransitionTime":"2026-03-13T20:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.112631 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.112676 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.112686 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.112704 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.112714 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:39Z","lastTransitionTime":"2026-03-13T20:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.215300 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.215351 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.215362 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.215401 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.215414 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:39Z","lastTransitionTime":"2026-03-13T20:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.251557 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/1.log" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.253769 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060"} Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.254143 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.265438 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34b
f57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\
\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.280372 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc 
kubenswrapper[4790]: I0313 20:29:39.290338 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.299538 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.311022 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.317628 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.317663 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.317671 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.317686 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.317696 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:39Z","lastTransitionTime":"2026-03-13T20:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.322916 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.334395 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.346324 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.360113 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\"
:\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.373743 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.392049 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.407754 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.419824 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.419882 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.419896 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.419918 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.419932 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:39Z","lastTransitionTime":"2026-03-13T20:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.430015 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:24Z\\\",\\\"message\\\":\\\"mn _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:24.978253 6832 ovnkube.go:599] Stopped ovnkube\\\\nI0313 20:29:24.978290 6832 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 20:29:24.978371 6832 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.447119 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480e
bb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.469322 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d
98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.485167 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.500646 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.522956 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.522999 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.523009 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.523024 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.523035 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:39Z","lastTransitionTime":"2026-03-13T20:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.625731 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.625785 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.625796 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.625813 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.625823 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:39Z","lastTransitionTime":"2026-03-13T20:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.659259 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.659362 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.659480 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.659511 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:39 crc kubenswrapper[4790]: E0313 20:29:39.659496 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:39 crc kubenswrapper[4790]: E0313 20:29:39.659614 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:39 crc kubenswrapper[4790]: E0313 20:29:39.659690 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:39 crc kubenswrapper[4790]: E0313 20:29:39.659748 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.682448 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520e
d63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.702046 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 
20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.719175 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.729399 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.729468 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.729481 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.729503 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.729517 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:39Z","lastTransitionTime":"2026-03-13T20:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.735207 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.753815 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.783421 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f
8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-
13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.801265 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.821094 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.831759 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.831805 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.831813 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.831830 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.831840 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:39Z","lastTransitionTime":"2026-03-13T20:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.853094 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:24Z\\\",\\\"message\\\":\\\"mn _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:24.978253 6832 ovnkube.go:599] Stopped ovnkube\\\\nI0313 20:29:24.978290 6832 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 20:29:24.978371 6832 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.872241 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480e
bb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.885417 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.898078 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.910463 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.923756 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc 
kubenswrapper[4790]: I0313 20:29:39.934280 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.934322 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.934331 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.934344 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.934353 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:39Z","lastTransitionTime":"2026-03-13T20:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.941625 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.958165 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.977851 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.037023 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc 
kubenswrapper[4790]: I0313 20:29:40.037064 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.037074 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.037087 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.037098 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.140819 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.140901 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.140910 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.140925 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.140935 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.244276 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.244435 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.244461 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.244491 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.244515 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.260539 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/2.log" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.261455 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/1.log" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.265154 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060" exitCode=1 Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.265227 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060"} Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.265299 4790 scope.go:117] "RemoveContainer" containerID="e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.266489 4790 scope.go:117] "RemoveContainer" containerID="921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060" Mar 13 20:29:40 crc kubenswrapper[4790]: E0313 20:29:40.266844 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.281911 4790 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.298852 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.312466 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.323867 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc 
kubenswrapper[4790]: I0313 20:29:40.337770 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.346867 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.346906 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.346914 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.346929 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.346939 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.351273 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.365635 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.375390 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.391029 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.402599 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.418205 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.432248 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.449753 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 
20:29:40.449829 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.449846 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.449868 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.449884 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.455163 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.468971 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.481069 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.499141 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:24Z\\\",\\\"message\\\":\\\"mn _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:24.978253 6832 ovnkube.go:599] Stopped ovnkube\\\\nI0313 20:29:24.978290 6832 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 20:29:24.978371 6832 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:39Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true 
skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:39.495482 7016 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, 
Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.511988 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480e
bb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.552657 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.552700 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.552709 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.552747 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.552757 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.655754 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.655822 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.655838 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.655854 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.655865 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.758840 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.758908 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.758928 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.758956 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.758972 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.786360 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.786431 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.786441 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.786456 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.786468 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: E0313 20:29:40.804307 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.808216 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.808254 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.808265 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.808281 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.808293 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: E0313 20:29:40.822838 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.826864 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.826902 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.826922 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.826940 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.826951 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: E0313 20:29:40.838554 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.842136 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.842172 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.842181 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.842195 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.842207 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: E0313 20:29:40.854976 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.858582 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.858611 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.858619 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.858632 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.858640 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: E0313 20:29:40.872422 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: E0313 20:29:40.872561 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.874097 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.874166 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.874180 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.874197 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.874212 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.976266 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.976368 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.976402 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.976436 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.976446 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.078636 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.078698 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.078715 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.078739 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.078757 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:41Z","lastTransitionTime":"2026-03-13T20:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.181851 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.181890 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.181900 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.181916 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.181954 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:41Z","lastTransitionTime":"2026-03-13T20:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.271117 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/2.log" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.277276 4790 scope.go:117] "RemoveContainer" containerID="921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060" Mar 13 20:29:41 crc kubenswrapper[4790]: E0313 20:29:41.277811 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.286183 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.286440 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.286551 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.286637 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.286704 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:41Z","lastTransitionTime":"2026-03-13T20:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.304933 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:39Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:39.495482 7016 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537b
d2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.322589 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480e
bb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.350941 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d
98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.364300 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.376412 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.389601 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.389698 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.389727 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.389802 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.389829 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:41Z","lastTransitionTime":"2026-03-13T20:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.393222 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f1
2962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.408637 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc 
kubenswrapper[4790]: I0313 20:29:41.422668 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.437600 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.451321 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.470402 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.485738 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.492187 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:41 crc 
kubenswrapper[4790]: I0313 20:29:41.492232 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.492243 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.492259 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.492269 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:41Z","lastTransitionTime":"2026-03-13T20:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.501347 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.516724 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.527861 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.548481 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.567051 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.594431 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.594486 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.594504 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.594532 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.594550 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:41Z","lastTransitionTime":"2026-03-13T20:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.658848 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.658974 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.659256 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:41 crc kubenswrapper[4790]: E0313 20:29:41.659739 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.659769 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:41 crc kubenswrapper[4790]: E0313 20:29:41.659971 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:41 crc kubenswrapper[4790]: E0313 20:29:41.660468 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:41 crc kubenswrapper[4790]: E0313 20:29:41.661629 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.697256 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.697315 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.697334 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.697357 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.697410 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:41Z","lastTransitionTime":"2026-03-13T20:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.800649 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.800726 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.800751 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.800777 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.800795 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:41Z","lastTransitionTime":"2026-03-13T20:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.904648 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.904719 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.904736 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.904760 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.904778 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:41Z","lastTransitionTime":"2026-03-13T20:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.006836 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.006878 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.006890 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.006906 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.006918 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:42Z","lastTransitionTime":"2026-03-13T20:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.109590 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.109900 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.110246 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.110585 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.110739 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:42Z","lastTransitionTime":"2026-03-13T20:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.213486 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.213710 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.213815 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.213897 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.213975 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:42Z","lastTransitionTime":"2026-03-13T20:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.316533 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.316596 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.316614 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.316637 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.316653 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:42Z","lastTransitionTime":"2026-03-13T20:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.419276 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.419531 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.419651 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.419753 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.419840 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:42Z","lastTransitionTime":"2026-03-13T20:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.522474 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.522516 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.522528 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.522542 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.522553 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:42Z","lastTransitionTime":"2026-03-13T20:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.625326 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.625407 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.625429 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.625453 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.625469 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:42Z","lastTransitionTime":"2026-03-13T20:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.671583 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.728796 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.728840 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.728857 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.728878 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.728897 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:42Z","lastTransitionTime":"2026-03-13T20:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.832209 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.832251 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.832269 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.832293 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.832310 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:42Z","lastTransitionTime":"2026-03-13T20:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.935672 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.935773 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.935792 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.935853 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.935875 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:42Z","lastTransitionTime":"2026-03-13T20:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.038640 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.038711 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.038723 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.038737 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.038748 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:43Z","lastTransitionTime":"2026-03-13T20:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.141766 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.141798 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.141809 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.141824 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.141833 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:43Z","lastTransitionTime":"2026-03-13T20:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.244363 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.244464 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.244486 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.244515 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.244538 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:43Z","lastTransitionTime":"2026-03-13T20:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.347031 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.347068 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.347077 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.347091 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.347101 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:43Z","lastTransitionTime":"2026-03-13T20:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.450585 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.450626 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.450638 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.450654 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.450668 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:43Z","lastTransitionTime":"2026-03-13T20:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.552684 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.552796 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.552815 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.552838 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.552855 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:43Z","lastTransitionTime":"2026-03-13T20:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.655465 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.655501 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.655511 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.655526 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.655537 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:43Z","lastTransitionTime":"2026-03-13T20:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.659801 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.659867 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.659905 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.659833 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:43 crc kubenswrapper[4790]: E0313 20:29:43.660023 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:43 crc kubenswrapper[4790]: E0313 20:29:43.660241 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:43 crc kubenswrapper[4790]: E0313 20:29:43.660651 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:43 crc kubenswrapper[4790]: E0313 20:29:43.660739 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.661570 4790 scope.go:117] "RemoveContainer" containerID="39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.758194 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.758229 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.758239 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.758255 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.758266 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:43Z","lastTransitionTime":"2026-03-13T20:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.861022 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.861064 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.861076 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.861091 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.861102 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:43Z","lastTransitionTime":"2026-03-13T20:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.963826 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.963870 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.963885 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.963906 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.963921 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:43Z","lastTransitionTime":"2026-03-13T20:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.067355 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.067418 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.067430 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.067452 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.067468 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:44Z","lastTransitionTime":"2026-03-13T20:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.170460 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.170510 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.170522 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.170540 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.170551 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:44Z","lastTransitionTime":"2026-03-13T20:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.272830 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.272864 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.272877 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.272893 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.272904 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:44Z","lastTransitionTime":"2026-03-13T20:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.287749 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.289504 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20"} Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.289979 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.301511 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd
571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.318243 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 
20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.331522 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.345401 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.359272 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.374982 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.375063 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.375075 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.375093 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.375110 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:44Z","lastTransitionTime":"2026-03-13T20:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.374960 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.400615 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.415281 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.427877 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.446748 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:39Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:39.495482 7016 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537b
d2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.461034 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480e
bb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.473583 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.477338 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.477404 4790 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.477416 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.477434 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.477444 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:44Z","lastTransitionTime":"2026-03-13T20:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.484700 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.495549 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.506180 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc 
kubenswrapper[4790]: I0313 20:29:44.516289 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.527135 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.539959 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.580764 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:44 crc 
kubenswrapper[4790]: I0313 20:29:44.580793 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.580803 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.580816 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.580825 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:44Z","lastTransitionTime":"2026-03-13T20:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.670573 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.682704 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.682731 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.682739 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.682751 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.682759 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:44Z","lastTransitionTime":"2026-03-13T20:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.785085 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.785122 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.785135 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.785150 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.785160 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:44Z","lastTransitionTime":"2026-03-13T20:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.888190 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.888233 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.888245 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.888261 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.888274 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:44Z","lastTransitionTime":"2026-03-13T20:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.991018 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.991055 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.991066 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.991081 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.991092 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:44Z","lastTransitionTime":"2026-03-13T20:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.093360 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.093423 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.093436 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.093450 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.093459 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:45Z","lastTransitionTime":"2026-03-13T20:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.198911 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.199068 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.199213 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.199309 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.199337 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:45Z","lastTransitionTime":"2026-03-13T20:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.302152 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.302226 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.302246 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.302275 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.302295 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:45Z","lastTransitionTime":"2026-03-13T20:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.405674 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.405737 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.405759 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.405804 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.405828 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:45Z","lastTransitionTime":"2026-03-13T20:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.509666 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.509738 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.509756 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.510182 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.510239 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:45Z","lastTransitionTime":"2026-03-13T20:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.572782 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.572927 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.572990 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573003 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:30:17.572968419 +0000 UTC m=+148.594084310 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573040 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:30:17.573031541 +0000 UTC m=+148.594147542 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.573095 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.573142 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.573179 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573206 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573252 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:30:17.573236307 +0000 UTC m=+148.594352268 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573306 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573321 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573332 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573364 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 20:30:17.57335731 +0000 UTC m=+148.594473301 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573478 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573513 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573533 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573609 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 20:30:17.573590188 +0000 UTC m=+148.594706109 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.612319 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.612350 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.612357 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.612370 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.612402 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:45Z","lastTransitionTime":"2026-03-13T20:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.659604 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.659735 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.659835 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.659889 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.659847 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.659994 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.660052 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.660102 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.673838 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.674033 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.674242 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs podName:c54336a0-5a12-4bf9-9807-337dd352fdb6 nodeName:}" failed. No retries permitted until 2026-03-13 20:30:17.674226624 +0000 UTC m=+148.695342515 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs") pod "network-metrics-daemon-mnf26" (UID: "c54336a0-5a12-4bf9-9807-337dd352fdb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.715074 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.715318 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.715410 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.715496 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.715570 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:45Z","lastTransitionTime":"2026-03-13T20:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.818203 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.818239 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.818250 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.818266 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.818276 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:45Z","lastTransitionTime":"2026-03-13T20:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.920827 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.920863 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.920873 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.920887 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.920897 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:45Z","lastTransitionTime":"2026-03-13T20:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.023006 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.023039 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.023048 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.023062 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.023072 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:46Z","lastTransitionTime":"2026-03-13T20:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.125597 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.125624 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.125631 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.125643 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.125652 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:46Z","lastTransitionTime":"2026-03-13T20:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.228111 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.228167 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.228180 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.228193 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.228203 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:46Z","lastTransitionTime":"2026-03-13T20:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.331018 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.331177 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.331197 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.331226 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.331245 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:46Z","lastTransitionTime":"2026-03-13T20:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.434472 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.434600 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.434633 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.434662 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.434685 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:46Z","lastTransitionTime":"2026-03-13T20:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.536913 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.536955 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.536971 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.536985 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.536995 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:46Z","lastTransitionTime":"2026-03-13T20:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.639328 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.639418 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.639433 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.639452 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.639464 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:46Z","lastTransitionTime":"2026-03-13T20:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.741800 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.741857 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.741867 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.741882 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.741891 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:46Z","lastTransitionTime":"2026-03-13T20:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.845287 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.845355 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.845369 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.845417 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.845434 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:46Z","lastTransitionTime":"2026-03-13T20:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.948010 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.948059 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.948068 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.948083 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.948098 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:46Z","lastTransitionTime":"2026-03-13T20:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.050801 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.050844 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.050866 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.050879 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.050890 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:47Z","lastTransitionTime":"2026-03-13T20:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.154003 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.154050 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.154060 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.154074 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.154084 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:47Z","lastTransitionTime":"2026-03-13T20:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.257679 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.257716 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.257726 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.257741 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.257755 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:47Z","lastTransitionTime":"2026-03-13T20:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.360096 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.360154 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.360164 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.360182 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.360194 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:47Z","lastTransitionTime":"2026-03-13T20:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.463531 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.463594 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.463618 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.463646 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.463673 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:47Z","lastTransitionTime":"2026-03-13T20:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.566048 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.566090 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.566099 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.566113 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.566124 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:47Z","lastTransitionTime":"2026-03-13T20:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.659484 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.659503 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:47 crc kubenswrapper[4790]: E0313 20:29:47.659632 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.659507 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.659714 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:47 crc kubenswrapper[4790]: E0313 20:29:47.659808 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:47 crc kubenswrapper[4790]: E0313 20:29:47.659937 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:47 crc kubenswrapper[4790]: E0313 20:29:47.660010 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.667843 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.667871 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.667879 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.667891 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.667900 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:47Z","lastTransitionTime":"2026-03-13T20:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.770633 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.770687 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.770698 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.770715 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.770725 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:47Z","lastTransitionTime":"2026-03-13T20:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.873621 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.873703 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.873778 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.873809 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.873829 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:47Z","lastTransitionTime":"2026-03-13T20:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.975872 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.975918 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.975928 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.975952 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.975964 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:47Z","lastTransitionTime":"2026-03-13T20:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.078732 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.078784 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.078795 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.078812 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.078823 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:48Z","lastTransitionTime":"2026-03-13T20:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.181535 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.181615 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.181652 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.181683 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.181703 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:48Z","lastTransitionTime":"2026-03-13T20:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.283667 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.283705 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.283718 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.283735 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.283748 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:48Z","lastTransitionTime":"2026-03-13T20:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.386491 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.386531 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.386541 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.386555 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.386563 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:48Z","lastTransitionTime":"2026-03-13T20:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.489789 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.489919 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.489972 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.490000 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.490016 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:48Z","lastTransitionTime":"2026-03-13T20:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.592707 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.592767 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.592776 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.592790 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.592989 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:48Z","lastTransitionTime":"2026-03-13T20:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.696321 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.696360 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.696371 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.696403 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.696415 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:48Z","lastTransitionTime":"2026-03-13T20:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.799096 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.799454 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.799584 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.799709 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.799815 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:48Z","lastTransitionTime":"2026-03-13T20:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.902554 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.902617 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.902627 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.902642 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.902653 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:48Z","lastTransitionTime":"2026-03-13T20:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.004351 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.004430 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.004440 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.004456 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.004466 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:49Z","lastTransitionTime":"2026-03-13T20:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.106402 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.106450 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.106460 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.106476 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.106488 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:49Z","lastTransitionTime":"2026-03-13T20:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.209081 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.209117 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.209126 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.209150 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.209161 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:49Z","lastTransitionTime":"2026-03-13T20:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.310827 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.311071 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.311130 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.311188 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.311241 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:49Z","lastTransitionTime":"2026-03-13T20:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.413464 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.413499 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.413508 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.413522 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.413534 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:49Z","lastTransitionTime":"2026-03-13T20:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.516277 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.516599 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.516693 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.516780 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.516870 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:49Z","lastTransitionTime":"2026-03-13T20:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:49 crc kubenswrapper[4790]: E0313 20:29:49.617147 4790 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.658795 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.658796 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.658861 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.658870 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:49 crc kubenswrapper[4790]: E0313 20:29:49.659412 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:49 crc kubenswrapper[4790]: E0313 20:29:49.659593 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:49 crc kubenswrapper[4790]: E0313 20:29:49.659687 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:49 crc kubenswrapper[4790]: E0313 20:29:49.659778 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.673479 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.687425 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.699358 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.713706 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5
c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.728738 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: E0313 20:29:49.740700 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.745544 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.760985 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.773547 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.784967 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.801585 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.821079 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e
9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.
168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b
54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.832452 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.843000 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.858358 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:39Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:39.495482 7016 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537b
d2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.868558 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.879029 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fee35d2c-dae5-419f-880c-c4a9920b5003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72aa968f
cc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.889477 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.900987 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.912470 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.234939 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.234980 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.234988 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.235003 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.235012 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:51Z","lastTransitionTime":"2026-03-13T20:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:51 crc kubenswrapper[4790]: E0313 20:29:51.249537 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:51Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.254184 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.254263 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.254283 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.254330 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.254348 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:51Z","lastTransitionTime":"2026-03-13T20:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:51 crc kubenswrapper[4790]: E0313 20:29:51.275152 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:51Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.278846 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.278908 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.278924 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.278944 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.278956 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:51Z","lastTransitionTime":"2026-03-13T20:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:51 crc kubenswrapper[4790]: E0313 20:29:51.292353 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:51Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.295943 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.296022 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.296042 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.296110 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.296131 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:51Z","lastTransitionTime":"2026-03-13T20:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:51 crc kubenswrapper[4790]: E0313 20:29:51.313011 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:51Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.316604 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.316661 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.316700 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.316738 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.316763 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:51Z","lastTransitionTime":"2026-03-13T20:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:51 crc kubenswrapper[4790]: E0313 20:29:51.331633 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:51Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:51 crc kubenswrapper[4790]: E0313 20:29:51.331750 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.659753 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.659841 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.659909 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:51 crc kubenswrapper[4790]: E0313 20:29:51.660063 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.660314 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:51 crc kubenswrapper[4790]: E0313 20:29:51.660423 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:51 crc kubenswrapper[4790]: E0313 20:29:51.660613 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:51 crc kubenswrapper[4790]: E0313 20:29:51.660741 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:53 crc kubenswrapper[4790]: I0313 20:29:53.659274 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:53 crc kubenswrapper[4790]: I0313 20:29:53.659316 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:53 crc kubenswrapper[4790]: I0313 20:29:53.659422 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:53 crc kubenswrapper[4790]: I0313 20:29:53.659502 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:53 crc kubenswrapper[4790]: E0313 20:29:53.659495 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:53 crc kubenswrapper[4790]: E0313 20:29:53.659689 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:53 crc kubenswrapper[4790]: E0313 20:29:53.659826 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:53 crc kubenswrapper[4790]: E0313 20:29:53.659990 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:54 crc kubenswrapper[4790]: E0313 20:29:54.742421 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:29:55 crc kubenswrapper[4790]: I0313 20:29:55.659496 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:55 crc kubenswrapper[4790]: I0313 20:29:55.659730 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:55 crc kubenswrapper[4790]: I0313 20:29:55.660062 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:55 crc kubenswrapper[4790]: I0313 20:29:55.660117 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:55 crc kubenswrapper[4790]: E0313 20:29:55.660150 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:55 crc kubenswrapper[4790]: E0313 20:29:55.660211 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:55 crc kubenswrapper[4790]: E0313 20:29:55.660337 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:55 crc kubenswrapper[4790]: E0313 20:29:55.660548 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:55 crc kubenswrapper[4790]: I0313 20:29:55.660835 4790 scope.go:117] "RemoveContainer" containerID="921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060" Mar 13 20:29:55 crc kubenswrapper[4790]: E0313 20:29:55.661098 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" Mar 13 20:29:57 crc kubenswrapper[4790]: I0313 20:29:57.659250 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:57 crc kubenswrapper[4790]: I0313 20:29:57.659286 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:57 crc kubenswrapper[4790]: I0313 20:29:57.659250 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:57 crc kubenswrapper[4790]: E0313 20:29:57.659423 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:57 crc kubenswrapper[4790]: E0313 20:29:57.659455 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:57 crc kubenswrapper[4790]: E0313 20:29:57.659518 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:57 crc kubenswrapper[4790]: I0313 20:29:57.659563 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:57 crc kubenswrapper[4790]: E0313 20:29:57.659877 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.659248 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:59 crc kubenswrapper[4790]: E0313 20:29:59.659398 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.659494 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.659548 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.659900 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:59 crc kubenswrapper[4790]: E0313 20:29:59.659886 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:59 crc kubenswrapper[4790]: E0313 20:29:59.659993 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:59 crc kubenswrapper[4790]: E0313 20:29:59.660063 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.681049 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.695887 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.712652 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.730296 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 
20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: E0313 20:29:59.742928 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.745995 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.757357 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.769236 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.781813 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.793238 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.823672 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20
:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.836262 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.847170 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.865205 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:39Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:39.495482 7016 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537b
d2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.877825 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480e
bb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.891735 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fee35d2c-dae5-419f-880c-c4a9920b5003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb59
6fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.904631 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.920042 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.933775 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.946280 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc 
kubenswrapper[4790]: I0313 20:30:01.344201 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x2tjg_207e7f49-094a-4e59-a8ff-9eacd8d6fe2a/kube-multus/0.log" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.344252 4790 generic.go:334] "Generic (PLEG): container finished" podID="207e7f49-094a-4e59-a8ff-9eacd8d6fe2a" containerID="fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139" exitCode=1 Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.344278 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x2tjg" event={"ID":"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a","Type":"ContainerDied","Data":"fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139"} Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.344620 4790 scope.go:117] "RemoveContainer" containerID="fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.359966 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.378340 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.402396 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:39Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:39.495482 7016 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537b
d2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.423301 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480e
bb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.445292 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d
98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.456267 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.466154 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.477144 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.487597 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc 
kubenswrapper[4790]: I0313 20:30:01.498895 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fee35d2c-dae5-419f-880c-c4a9920b5003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.511790 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.526154 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"2026-03-13T20:29:15+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43\\\\n2026-03-13T20:29:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43 to /host/opt/cni/bin/\\\\n2026-03-13T20:29:16Z [verbose] multus-daemon started\\\\n2026-03-13T20:29:16Z [verbose] Readiness Indicator file check\\\\n2026-03-13T20:30:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.540011 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.552274 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.565056 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.578671 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.594454 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.607115 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.622466 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.661873 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.662205 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.662650 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:01 crc kubenswrapper[4790]: E0313 20:30:01.662348 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.662244 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:01 crc kubenswrapper[4790]: E0313 20:30:01.662759 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:01 crc kubenswrapper[4790]: E0313 20:30:01.662887 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:01 crc kubenswrapper[4790]: E0313 20:30:01.662930 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.675779 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.675834 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.675847 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.675863 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.675875 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:01Z","lastTransitionTime":"2026-03-13T20:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:30:01 crc kubenswrapper[4790]: E0313 20:30:01.687979 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.691797 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.691919 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.692002 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.692094 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.692167 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:01Z","lastTransitionTime":"2026-03-13T20:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:30:01 crc kubenswrapper[4790]: E0313 20:30:01.705590 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.708635 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.708749 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.708843 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.708933 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.708996 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:01Z","lastTransitionTime":"2026-03-13T20:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:30:01 crc kubenswrapper[4790]: E0313 20:30:01.719702 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.722298 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.722440 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.722544 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.722621 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.722699 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:01Z","lastTransitionTime":"2026-03-13T20:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:30:01 crc kubenswrapper[4790]: E0313 20:30:01.734307 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.738172 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.738273 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.738358 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.738464 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.738540 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:01Z","lastTransitionTime":"2026-03-13T20:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:30:01 crc kubenswrapper[4790]: E0313 20:30:01.750024 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: E0313 20:30:01.750415 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.349166 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x2tjg_207e7f49-094a-4e59-a8ff-9eacd8d6fe2a/kube-multus/0.log" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.349234 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x2tjg" event={"ID":"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a","Type":"ContainerStarted","Data":"9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e"} Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.363663 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.380602 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.397246 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.410657 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.422548 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.433574 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.462620 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20
:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.476723 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.488442 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.507013 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:39Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:39.495482 7016 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537b
d2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.523006 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480e
bb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.534771 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fee35d2c-dae5-419f-880c-c4a9920b5003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb59
6fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.543225 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.552846 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.563094 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.572965 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc 
kubenswrapper[4790]: I0313 20:30:02.583805 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.595733 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.610279 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"2026-03-13T20:29:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43\\\\n2026-03-13T20:29:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43 to /host/opt/cni/bin/\\\\n2026-03-13T20:29:16Z [verbose] multus-daemon started\\\\n2026-03-13T20:29:16Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T20:30:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:30:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.804895 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.816152 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.837563 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:39Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:39.495482 7016 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537b
d2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.852737 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480e
bb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.872887 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d
98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.887861 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.901550 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.915332 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf
57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\
\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.926188 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc 
kubenswrapper[4790]: I0313 20:30:02.940734 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fee35d2c-dae5-419f-880c-c4a9920b5003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.951794 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.964164 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"2026-03-13T20:29:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43\\\\n2026-03-13T20:29:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43 to /host/opt/cni/bin/\\\\n2026-03-13T20:29:16Z [verbose] multus-daemon started\\\\n2026-03-13T20:29:16Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T20:30:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:30:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.975159 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.988457 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:03 crc kubenswrapper[4790]: I0313 20:30:03.001114 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:03 crc kubenswrapper[4790]: I0313 20:30:03.019538 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:30:03Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:03 crc kubenswrapper[4790]: I0313 20:30:03.054673 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:03Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:03 crc kubenswrapper[4790]: I0313 20:30:03.067998 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:03Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:03 crc kubenswrapper[4790]: I0313 20:30:03.082554 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T2
0:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:03Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:03 crc kubenswrapper[4790]: I0313 20:30:03.098204 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:03Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:03 crc kubenswrapper[4790]: I0313 20:30:03.659930 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:03 crc kubenswrapper[4790]: I0313 20:30:03.660022 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:03 crc kubenswrapper[4790]: E0313 20:30:03.660087 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:03 crc kubenswrapper[4790]: E0313 20:30:03.660177 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:03 crc kubenswrapper[4790]: I0313 20:30:03.660271 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:03 crc kubenswrapper[4790]: E0313 20:30:03.660604 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:03 crc kubenswrapper[4790]: I0313 20:30:03.660929 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:03 crc kubenswrapper[4790]: E0313 20:30:03.660997 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:04 crc kubenswrapper[4790]: E0313 20:30:04.744216 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:30:05 crc kubenswrapper[4790]: I0313 20:30:05.659272 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:05 crc kubenswrapper[4790]: E0313 20:30:05.659887 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:05 crc kubenswrapper[4790]: I0313 20:30:05.659348 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:05 crc kubenswrapper[4790]: E0313 20:30:05.659983 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:05 crc kubenswrapper[4790]: I0313 20:30:05.659459 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:05 crc kubenswrapper[4790]: E0313 20:30:05.660058 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:05 crc kubenswrapper[4790]: I0313 20:30:05.659341 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:05 crc kubenswrapper[4790]: E0313 20:30:05.660125 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:07 crc kubenswrapper[4790]: I0313 20:30:07.659753 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:07 crc kubenswrapper[4790]: I0313 20:30:07.659868 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:07 crc kubenswrapper[4790]: I0313 20:30:07.659774 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:07 crc kubenswrapper[4790]: E0313 20:30:07.659924 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:07 crc kubenswrapper[4790]: I0313 20:30:07.659868 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:07 crc kubenswrapper[4790]: E0313 20:30:07.660093 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:07 crc kubenswrapper[4790]: E0313 20:30:07.660173 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:07 crc kubenswrapper[4790]: E0313 20:30:07.660354 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:08 crc kubenswrapper[4790]: I0313 20:30:08.659735 4790 scope.go:117] "RemoveContainer" containerID="921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.370323 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/2.log" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.373109 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f"} Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.373695 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.389549 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb26
65609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.403038 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.416542 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.430815 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.444709 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.458832 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.481118 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20
:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.501129 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.517528 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.537038 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:39Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:39.495482 7016 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, 
Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\
\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.551682 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480e
bb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.564698 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fee35d2c-dae5-419f-880c-c4a9920b5003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb59
6fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.577594 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.595345 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.604646 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.614207 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc 
kubenswrapper[4790]: I0313 20:30:09.628499 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.639709 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.652781 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"2026-03-13T20:29:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43\\\\n2026-03-13T20:29:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43 to /host/opt/cni/bin/\\\\n2026-03-13T20:29:16Z [verbose] multus-daemon started\\\\n2026-03-13T20:29:16Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T20:30:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:30:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.659064 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.659101 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:09 crc kubenswrapper[4790]: E0313 20:30:09.659221 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.659239 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:09 crc kubenswrapper[4790]: E0313 20:30:09.659308 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.659352 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:09 crc kubenswrapper[4790]: E0313 20:30:09.659427 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:09 crc kubenswrapper[4790]: E0313 20:30:09.659477 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.680519 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036
cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\
\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.694495 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.705786 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.723684 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:39Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:39.495482 7016 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, 
Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\
\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: E0313 20:30:09.744721 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.747270 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68e
c1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.761626 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fee35d2c-dae5-419f-880c-c4a9920b5003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name
\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.771068 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.780097 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.789819 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.798245 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc 
kubenswrapper[4790]: I0313 20:30:09.808523 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.821594 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.833303 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"2026-03-13T20:29:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43\\\\n2026-03-13T20:29:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43 to /host/opt/cni/bin/\\\\n2026-03-13T20:29:16Z [verbose] multus-daemon started\\\\n2026-03-13T20:29:16Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T20:30:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:30:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.848495 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.863602 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.877258 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.893594 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.907479 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.917582 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.379067 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/3.log" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.379769 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/2.log" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.382518 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" exitCode=1 Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.382602 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f"} Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.382692 4790 scope.go:117] "RemoveContainer" containerID="921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.383956 4790 scope.go:117] "RemoveContainer" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" Mar 13 20:30:10 crc kubenswrapper[4790]: E0313 20:30:10.384359 
4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.397315 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fee35d2c-dae5-419f-880c-c4a9920b5003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.411055 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.423887 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.437336 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.451151 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc 
kubenswrapper[4790]: I0313 20:30:10.467154 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.481199 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.495284 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"2026-03-13T20:29:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43\\\\n2026-03-13T20:29:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43 to /host/opt/cni/bin/\\\\n2026-03-13T20:29:16Z [verbose] multus-daemon started\\\\n2026-03-13T20:29:16Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T20:30:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:30:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.512926 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.527692 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.545370 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.558656 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.568612 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.583168 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.605128 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20
:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.620231 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.634592 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.652179 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:39Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:39.495482 7016 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:09Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 20:30:09.493863 7376 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 20:30:09.493873 7376 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 20:30:09.493894 7376 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 20:30:09.493947 7376 factory.go:1336] 
Added *v1.EgressIP event handler 8\\\\nI0313 20:30:09.494023 7376 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 20:30:09.494069 7376 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0313 20:30:09.494075 7376 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0313 20:30:09.494107 7376 factory.go:656] Stopping watch factory\\\\nI0313 20:30:09.494117 7376 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 20:30:09.494131 7376 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 20:30:09.494152 7376 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 20:30:09.494509 7376 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 20:30:09.494643 7376 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 20:30:09.494714 7376 ovnkube.go:599] Stopped ovnkube\\\\nI0313 20:30:09.494790 7376 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 20:30:09.494922 7376 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dd
e35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.669170 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480e
bb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.391730 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/3.log" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.398129 4790 scope.go:117] "RemoveContainer" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" Mar 13 20:30:11 crc kubenswrapper[4790]: E0313 20:30:11.398459 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.416729 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb26
65609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.433713 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.451305 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.465144 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.478170 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.490546 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.515180 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20
:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.529586 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.542368 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.563207 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:09Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 20:30:09.493863 7376 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 20:30:09.493873 7376 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 20:30:09.493894 7376 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 20:30:09.493947 7376 
factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0313 20:30:09.494023 7376 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 20:30:09.494069 7376 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0313 20:30:09.494075 7376 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0313 20:30:09.494107 7376 factory.go:656] Stopping watch factory\\\\nI0313 20:30:09.494117 7376 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 20:30:09.494131 7376 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 20:30:09.494152 7376 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 20:30:09.494509 7376 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 20:30:09.494643 7376 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 20:30:09.494714 7376 ovnkube.go:599] Stopped ovnkube\\\\nI0313 20:30:09.494790 7376 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 20:30:09.494922 7376 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:30:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537b
d2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.577167 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480e
bb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.587164 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fee35d2c-dae5-419f-880c-c4a9920b5003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb59
6fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.597589 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.607593 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.618558 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.630557 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc 
kubenswrapper[4790]: I0313 20:30:11.642016 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.654471 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.659164 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.659245 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:11 crc kubenswrapper[4790]: E0313 20:30:11.659265 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.659325 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:11 crc kubenswrapper[4790]: E0313 20:30:11.659369 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.659334 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:11 crc kubenswrapper[4790]: E0313 20:30:11.659502 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:11 crc kubenswrapper[4790]: E0313 20:30:11.659567 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.669541 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"2026-03-13T20:29:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43\\\\n2026-03-13T20:29:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43 to /host/opt/cni/bin/\\\\n2026-03-13T20:29:16Z [verbose] multus-daemon started\\\\n2026-03-13T20:29:16Z [verbose] Readiness Indicator file check\\\\n2026-03-13T20:30:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:30:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.757374 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.757445 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.757457 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.757473 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.757485 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:11Z","lastTransitionTime":"2026-03-13T20:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:30:11 crc kubenswrapper[4790]: E0313 20:30:11.773637 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.777628 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.777659 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.777668 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.777684 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.777693 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:11Z","lastTransitionTime":"2026-03-13T20:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:30:11 crc kubenswrapper[4790]: E0313 20:30:11.794903 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.799013 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.799072 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.799089 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.799106 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.799120 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:11Z","lastTransitionTime":"2026-03-13T20:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:30:11 crc kubenswrapper[4790]: E0313 20:30:11.812152 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.815669 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.815712 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.815731 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.815760 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.815803 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:11Z","lastTransitionTime":"2026-03-13T20:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:30:11 crc kubenswrapper[4790]: E0313 20:30:11.834138 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.838239 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.838316 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.838330 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.838353 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.838367 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:11Z","lastTransitionTime":"2026-03-13T20:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:30:11 crc kubenswrapper[4790]: E0313 20:30:11.851527 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: E0313 20:30:11.851707 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 20:30:13 crc kubenswrapper[4790]: I0313 20:30:13.659503 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:13 crc kubenswrapper[4790]: I0313 20:30:13.659572 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:13 crc kubenswrapper[4790]: I0313 20:30:13.659658 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:13 crc kubenswrapper[4790]: E0313 20:30:13.659769 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:13 crc kubenswrapper[4790]: I0313 20:30:13.659788 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:13 crc kubenswrapper[4790]: E0313 20:30:13.659956 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:13 crc kubenswrapper[4790]: E0313 20:30:13.660072 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:13 crc kubenswrapper[4790]: E0313 20:30:13.660180 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:14 crc kubenswrapper[4790]: E0313 20:30:14.746437 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:30:15 crc kubenswrapper[4790]: I0313 20:30:15.658958 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:15 crc kubenswrapper[4790]: I0313 20:30:15.659591 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:15 crc kubenswrapper[4790]: E0313 20:30:15.659792 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:15 crc kubenswrapper[4790]: I0313 20:30:15.659872 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:15 crc kubenswrapper[4790]: I0313 20:30:15.659806 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:15 crc kubenswrapper[4790]: E0313 20:30:15.660028 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:15 crc kubenswrapper[4790]: E0313 20:30:15.660092 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:15 crc kubenswrapper[4790]: E0313 20:30:15.660551 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:17 crc kubenswrapper[4790]: I0313 20:30:17.605644 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.605884 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.605856817 +0000 UTC m=+212.626972708 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:30:17 crc kubenswrapper[4790]: I0313 20:30:17.606152 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:17 crc kubenswrapper[4790]: I0313 20:30:17.606217 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:17 crc kubenswrapper[4790]: I0313 20:30:17.606255 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606331 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:30:17 crc kubenswrapper[4790]: I0313 20:30:17.606345 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606395 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.606363442 +0000 UTC m=+212.627479333 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606457 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606534 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606545 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606567 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606586 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606581 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606648 4790 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.606628229 +0000 UTC m=+212.627744190 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606699 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.60666882 +0000 UTC m=+212.627784811 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606586 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606783 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.606761753 +0000 UTC m=+212.627877784 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:30:17 crc kubenswrapper[4790]: I0313 20:30:17.659861 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:17 crc kubenswrapper[4790]: I0313 20:30:17.659878 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:17 crc kubenswrapper[4790]: I0313 20:30:17.660111 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.660240 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:17 crc kubenswrapper[4790]: I0313 20:30:17.660266 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.660507 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.660602 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.660680 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:17 crc kubenswrapper[4790]: I0313 20:30:17.707462 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.707753 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.707850 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs podName:c54336a0-5a12-4bf9-9807-337dd352fdb6 nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.707829163 +0000 UTC m=+212.728945054 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs") pod "network-metrics-daemon-mnf26" (UID: "c54336a0-5a12-4bf9-9807-337dd352fdb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.659271 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.659339 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:19 crc kubenswrapper[4790]: E0313 20:30:19.659434 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:19 crc kubenswrapper[4790]: E0313 20:30:19.659527 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.659546 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.659582 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:19 crc kubenswrapper[4790]: E0313 20:30:19.659658 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:19 crc kubenswrapper[4790]: E0313 20:30:19.659739 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.679904 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\
\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.693216 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.705059 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.724328 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:09Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 20:30:09.493863 7376 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 20:30:09.493873 7376 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 20:30:09.493894 7376 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 20:30:09.493947 7376 
factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0313 20:30:09.494023 7376 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 20:30:09.494069 7376 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0313 20:30:09.494075 7376 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0313 20:30:09.494107 7376 factory.go:656] Stopping watch factory\\\\nI0313 20:30:09.494117 7376 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 20:30:09.494131 7376 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 20:30:09.494152 7376 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 20:30:09.494509 7376 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 20:30:09.494643 7376 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 20:30:09.494714 7376 ovnkube.go:599] Stopped ovnkube\\\\nI0313 20:30:09.494790 7376 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 20:30:09.494922 7376 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:30:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537b
d2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.739054 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480e
bb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: E0313 20:30:19.746988 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.749843 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fee35d2c-dae5-419f-880c-c4a9920b5003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resource
s\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.760957 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.772555 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.784391 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.793493 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc 
kubenswrapper[4790]: I0313 20:30:19.803769 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.815736 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.830148 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"2026-03-13T20:29:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43\\\\n2026-03-13T20:29:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43 to /host/opt/cni/bin/\\\\n2026-03-13T20:29:16Z [verbose] multus-daemon started\\\\n2026-03-13T20:29:16Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T20:30:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:30:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.843210 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.859286 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.870486 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.880285 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.891653 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.901793 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:21 crc kubenswrapper[4790]: I0313 20:30:21.659792 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:21 crc kubenswrapper[4790]: I0313 20:30:21.659874 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:21 crc kubenswrapper[4790]: I0313 20:30:21.659917 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:21 crc kubenswrapper[4790]: I0313 20:30:21.659827 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:21 crc kubenswrapper[4790]: E0313 20:30:21.659957 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:21 crc kubenswrapper[4790]: E0313 20:30:21.660242 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:21 crc kubenswrapper[4790]: E0313 20:30:21.660363 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:21 crc kubenswrapper[4790]: E0313 20:30:21.660443 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.243720 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.243750 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.243759 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.243772 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.243780 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:22Z","lastTransitionTime":"2026-03-13T20:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:30:22 crc kubenswrapper[4790]: E0313 20:30:22.257405 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:22Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.261208 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.261237 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.261246 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.261258 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.261267 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:22Z","lastTransitionTime":"2026-03-13T20:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:30:22 crc kubenswrapper[4790]: E0313 20:30:22.278314 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:22Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.282315 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.282354 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.282365 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.282397 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.282407 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:22Z","lastTransitionTime":"2026-03-13T20:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:30:22 crc kubenswrapper[4790]: E0313 20:30:22.294714 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:22Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.298465 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.298518 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.298531 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.298551 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.298563 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:22Z","lastTransitionTime":"2026-03-13T20:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:30:22 crc kubenswrapper[4790]: E0313 20:30:22.314590 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:22Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.318740 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.318778 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.318786 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.318804 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.318845 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:22Z","lastTransitionTime":"2026-03-13T20:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:30:22 crc kubenswrapper[4790]: E0313 20:30:22.330536 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:22Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:22 crc kubenswrapper[4790]: E0313 20:30:22.330651 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 20:30:23 crc kubenswrapper[4790]: I0313 20:30:23.659116 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:23 crc kubenswrapper[4790]: E0313 20:30:23.659251 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:23 crc kubenswrapper[4790]: I0313 20:30:23.659116 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:23 crc kubenswrapper[4790]: I0313 20:30:23.659306 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:23 crc kubenswrapper[4790]: I0313 20:30:23.659320 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:23 crc kubenswrapper[4790]: E0313 20:30:23.659399 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:23 crc kubenswrapper[4790]: E0313 20:30:23.659456 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:23 crc kubenswrapper[4790]: E0313 20:30:23.659652 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:24 crc kubenswrapper[4790]: I0313 20:30:24.660333 4790 scope.go:117] "RemoveContainer" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" Mar 13 20:30:24 crc kubenswrapper[4790]: E0313 20:30:24.660828 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" Mar 13 20:30:24 crc kubenswrapper[4790]: E0313 20:30:24.748453 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:30:25 crc kubenswrapper[4790]: I0313 20:30:25.659125 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:25 crc kubenswrapper[4790]: I0313 20:30:25.659193 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:25 crc kubenswrapper[4790]: I0313 20:30:25.659207 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:25 crc kubenswrapper[4790]: I0313 20:30:25.659278 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:25 crc kubenswrapper[4790]: E0313 20:30:25.659284 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:25 crc kubenswrapper[4790]: E0313 20:30:25.659402 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:25 crc kubenswrapper[4790]: E0313 20:30:25.659563 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:25 crc kubenswrapper[4790]: E0313 20:30:25.659715 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:27 crc kubenswrapper[4790]: I0313 20:30:27.659316 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:27 crc kubenswrapper[4790]: I0313 20:30:27.659467 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:27 crc kubenswrapper[4790]: E0313 20:30:27.659536 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:27 crc kubenswrapper[4790]: I0313 20:30:27.659549 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:27 crc kubenswrapper[4790]: E0313 20:30:27.659604 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:27 crc kubenswrapper[4790]: E0313 20:30:27.659673 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:27 crc kubenswrapper[4790]: I0313 20:30:27.659488 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:27 crc kubenswrapper[4790]: E0313 20:30:27.659779 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.659152 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.659285 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.659285 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:29 crc kubenswrapper[4790]: E0313 20:30:29.659506 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.659594 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:29 crc kubenswrapper[4790]: E0313 20:30:29.659708 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:29 crc kubenswrapper[4790]: E0313 20:30:29.659815 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:29 crc kubenswrapper[4790]: E0313 20:30:29.659888 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.675448 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\
\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"2026-03-13T20:29:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43\\\\n2026-03-13T20:29:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43 to /host/opt/cni/bin/\\\\n2026-03-13T20:29:16Z [verbose] multus-daemon started\\\\n2026-03-13T20:29:16Z [verbose] Readiness Indicator file check\\\\n2026-03-13T20:30:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:30:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.689130 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.702074 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.716157 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.730782 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.742500 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: E0313 20:30:29.748988 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.756489 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{
\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.772229 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb26
65609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.787209 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.801350 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc 
kubenswrapper[4790]: I0313 20:30:29.828858 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:09Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 20:30:09.493863 7376 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 20:30:09.493873 7376 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 20:30:09.493894 7376 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 20:30:09.493947 7376 
factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0313 20:30:09.494023 7376 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 20:30:09.494069 7376 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0313 20:30:09.494075 7376 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0313 20:30:09.494107 7376 factory.go:656] Stopping watch factory\\\\nI0313 20:30:09.494117 7376 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 20:30:09.494131 7376 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 20:30:09.494152 7376 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 20:30:09.494509 7376 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 20:30:09.494643 7376 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 20:30:09.494714 7376 ovnkube.go:599] Stopped ovnkube\\\\nI0313 20:30:09.494790 7376 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 20:30:09.494922 7376 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:30:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537b
d2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.850171 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480e
bb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.872135 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d
98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.886016 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.895091 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.906497 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf
57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\
\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.916808 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc 
kubenswrapper[4790]: I0313 20:30:29.927415 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fee35d2c-dae5-419f-880c-c4a9920b5003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.936663 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:31 crc kubenswrapper[4790]: I0313 20:30:31.659873 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:31 crc kubenswrapper[4790]: I0313 20:30:31.659932 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:31 crc kubenswrapper[4790]: I0313 20:30:31.659894 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:31 crc kubenswrapper[4790]: E0313 20:30:31.660023 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:31 crc kubenswrapper[4790]: I0313 20:30:31.660212 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:31 crc kubenswrapper[4790]: E0313 20:30:31.660215 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:31 crc kubenswrapper[4790]: E0313 20:30:31.660273 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:31 crc kubenswrapper[4790]: E0313 20:30:31.660328 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.498192 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.498229 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.498237 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.498251 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.498261 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:32Z","lastTransitionTime":"2026-03-13T20:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:30:32 crc kubenswrapper[4790]: E0313 20:30:32.510912 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:32Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.516130 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.516196 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.516220 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.516252 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.516274 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:32Z","lastTransitionTime":"2026-03-13T20:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:30:32 crc kubenswrapper[4790]: E0313 20:30:32.531321 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:32Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.536253 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.536416 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.536459 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.536491 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.536508 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:32Z","lastTransitionTime":"2026-03-13T20:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:30:32 crc kubenswrapper[4790]: E0313 20:30:32.556143 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:32Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.560004 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.560057 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.560068 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.560085 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.560096 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:32Z","lastTransitionTime":"2026-03-13T20:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:30:32 crc kubenswrapper[4790]: E0313 20:30:32.576757 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:32Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.580462 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.580633 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.580731 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.580817 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.580937 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:32Z","lastTransitionTime":"2026-03-13T20:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:30:32 crc kubenswrapper[4790]: E0313 20:30:32.595295 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:32Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:32 crc kubenswrapper[4790]: E0313 20:30:32.595596 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 20:30:33 crc kubenswrapper[4790]: I0313 20:30:33.659621 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:33 crc kubenswrapper[4790]: I0313 20:30:33.659840 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:33 crc kubenswrapper[4790]: I0313 20:30:33.659882 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:33 crc kubenswrapper[4790]: I0313 20:30:33.659930 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:33 crc kubenswrapper[4790]: E0313 20:30:33.660569 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:33 crc kubenswrapper[4790]: E0313 20:30:33.660593 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:33 crc kubenswrapper[4790]: E0313 20:30:33.660821 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:33 crc kubenswrapper[4790]: E0313 20:30:33.661000 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:34 crc kubenswrapper[4790]: E0313 20:30:34.750824 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:30:35 crc kubenswrapper[4790]: I0313 20:30:35.659276 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:35 crc kubenswrapper[4790]: I0313 20:30:35.659498 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:35 crc kubenswrapper[4790]: I0313 20:30:35.659520 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:35 crc kubenswrapper[4790]: I0313 20:30:35.659600 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:35 crc kubenswrapper[4790]: E0313 20:30:35.659681 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:35 crc kubenswrapper[4790]: E0313 20:30:35.659783 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:35 crc kubenswrapper[4790]: E0313 20:30:35.659835 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:35 crc kubenswrapper[4790]: E0313 20:30:35.659905 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:37 crc kubenswrapper[4790]: I0313 20:30:37.659127 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:37 crc kubenswrapper[4790]: I0313 20:30:37.659349 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:37 crc kubenswrapper[4790]: I0313 20:30:37.659425 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:37 crc kubenswrapper[4790]: E0313 20:30:37.659460 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:37 crc kubenswrapper[4790]: I0313 20:30:37.660354 4790 scope.go:117] "RemoveContainer" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" Mar 13 20:30:37 crc kubenswrapper[4790]: E0313 20:30:37.660630 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" Mar 13 20:30:37 crc kubenswrapper[4790]: I0313 20:30:37.660892 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:37 crc kubenswrapper[4790]: E0313 20:30:37.661040 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:37 crc kubenswrapper[4790]: E0313 20:30:37.661185 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:37 crc kubenswrapper[4790]: E0313 20:30:37.661294 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.659692 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.659737 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.659876 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:39 crc kubenswrapper[4790]: E0313 20:30:39.659893 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.659916 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:39 crc kubenswrapper[4790]: E0313 20:30:39.660261 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:39 crc kubenswrapper[4790]: E0313 20:30:39.660422 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:39 crc kubenswrapper[4790]: E0313 20:30:39.660520 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.696706 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-x2tjg" podStartSLOduration=124.696679534 podStartE2EDuration="2m4.696679534s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:39.695122069 +0000 UTC m=+170.716237970" watchObservedRunningTime="2026-03-13 20:30:39.696679534 +0000 UTC m=+170.717795465" Mar 13 20:30:39 crc kubenswrapper[4790]: E0313 20:30:39.751609 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.787725 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podStartSLOduration=125.787700736 podStartE2EDuration="2m5.787700736s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:39.777844575 +0000 UTC m=+170.798960476" watchObservedRunningTime="2026-03-13 20:30:39.787700736 +0000 UTC m=+170.808816627" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.787972 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9tpww" podStartSLOduration=125.787966584 podStartE2EDuration="2m5.787966584s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-13 20:30:39.78783234 +0000 UTC m=+170.808948231" watchObservedRunningTime="2026-03-13 20:30:39.787966584 +0000 UTC m=+170.809082475" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.801564 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=72.801542002 podStartE2EDuration="1m12.801542002s" podCreationTimestamp="2026-03-13 20:29:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:39.801212453 +0000 UTC m=+170.822328344" watchObservedRunningTime="2026-03-13 20:30:39.801542002 +0000 UTC m=+170.822657903" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.813551 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=57.813534205 podStartE2EDuration="57.813534205s" podCreationTimestamp="2026-03-13 20:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:39.813226996 +0000 UTC m=+170.834342887" watchObservedRunningTime="2026-03-13 20:30:39.813534205 +0000 UTC m=+170.834650096" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.918936 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" podStartSLOduration=124.918918468 podStartE2EDuration="2m4.918918468s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:39.890534347 +0000 UTC m=+170.911650238" watchObservedRunningTime="2026-03-13 20:30:39.918918468 +0000 UTC m=+170.940034359" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.935546 4790 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=64.935527863 podStartE2EDuration="1m4.935527863s" podCreationTimestamp="2026-03-13 20:29:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:39.921024979 +0000 UTC m=+170.942140870" watchObservedRunningTime="2026-03-13 20:30:39.935527863 +0000 UTC m=+170.956643754" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.960716 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-x4d2p" podStartSLOduration=125.960694263 podStartE2EDuration="2m5.960694263s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:39.947761453 +0000 UTC m=+170.968877344" watchObservedRunningTime="2026-03-13 20:30:39.960694263 +0000 UTC m=+170.981810154" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.975476 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" podStartSLOduration=124.975440084 podStartE2EDuration="2m4.975440084s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:39.962474673 +0000 UTC m=+170.983590564" watchObservedRunningTime="2026-03-13 20:30:39.975440084 +0000 UTC m=+170.996555975" Mar 13 20:30:40 crc kubenswrapper[4790]: I0313 20:30:40.003291 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=85.00326957 podStartE2EDuration="1m25.00326957s" podCreationTimestamp="2026-03-13 20:29:15 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:40.002527829 +0000 UTC m=+171.023643720" watchObservedRunningTime="2026-03-13 20:30:40.00326957 +0000 UTC m=+171.024385461" Mar 13 20:30:40 crc kubenswrapper[4790]: I0313 20:30:40.003801 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=56.003794115 podStartE2EDuration="56.003794115s" podCreationTimestamp="2026-03-13 20:29:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:39.990102473 +0000 UTC m=+171.011218364" watchObservedRunningTime="2026-03-13 20:30:40.003794115 +0000 UTC m=+171.024910006" Mar 13 20:30:41 crc kubenswrapper[4790]: I0313 20:30:41.659127 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:41 crc kubenswrapper[4790]: I0313 20:30:41.659183 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:41 crc kubenswrapper[4790]: E0313 20:30:41.659354 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:41 crc kubenswrapper[4790]: I0313 20:30:41.659466 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:41 crc kubenswrapper[4790]: I0313 20:30:41.659653 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:41 crc kubenswrapper[4790]: E0313 20:30:41.659678 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:41 crc kubenswrapper[4790]: E0313 20:30:41.659815 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:41 crc kubenswrapper[4790]: E0313 20:30:41.659935 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.733615 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.733664 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.733675 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.733691 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.733703 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:42Z","lastTransitionTime":"2026-03-13T20:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.793198 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4"] Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.794263 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.806767 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.806872 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.807083 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.807139 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.976021 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6203dbf-1e64-41e0-9a73-26def8967139-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.976098 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f6203dbf-1e64-41e0-9a73-26def8967139-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.976142 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/f6203dbf-1e64-41e0-9a73-26def8967139-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.976164 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f6203dbf-1e64-41e0-9a73-26def8967139-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.976182 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6203dbf-1e64-41e0-9a73-26def8967139-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.077568 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6203dbf-1e64-41e0-9a73-26def8967139-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.077649 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f6203dbf-1e64-41e0-9a73-26def8967139-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.077711 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f6203dbf-1e64-41e0-9a73-26def8967139-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.077907 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6203dbf-1e64-41e0-9a73-26def8967139-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.077978 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f6203dbf-1e64-41e0-9a73-26def8967139-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.078011 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6203dbf-1e64-41e0-9a73-26def8967139-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.078123 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/f6203dbf-1e64-41e0-9a73-26def8967139-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.079237 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6203dbf-1e64-41e0-9a73-26def8967139-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.084370 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6203dbf-1e64-41e0-9a73-26def8967139-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.104270 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6203dbf-1e64-41e0-9a73-26def8967139-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.121088 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.511269 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" event={"ID":"f6203dbf-1e64-41e0-9a73-26def8967139","Type":"ContainerStarted","Data":"d959332980388be3c8e4e5623e9d1477ffca6e76a1a41023a6414e8e009fde45"} Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.511356 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" event={"ID":"f6203dbf-1e64-41e0-9a73-26def8967139","Type":"ContainerStarted","Data":"1cfae9b5abcdcffb5916c54c396d3f19b07737986f58ab4f908d43596166f4f3"} Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.527704 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" podStartSLOduration=129.527687154 podStartE2EDuration="2m9.527687154s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:43.525310156 +0000 UTC m=+174.546426047" watchObservedRunningTime="2026-03-13 20:30:43.527687154 +0000 UTC m=+174.548803045" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.660573 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.660663 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:43 crc kubenswrapper[4790]: E0313 20:30:43.660677 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:43 crc kubenswrapper[4790]: E0313 20:30:43.660743 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.660774 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:43 crc kubenswrapper[4790]: E0313 20:30:43.660826 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.660854 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:43 crc kubenswrapper[4790]: E0313 20:30:43.660923 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.700727 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.711250 4790 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 13 20:30:44 crc kubenswrapper[4790]: E0313 20:30:44.753004 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:30:45 crc kubenswrapper[4790]: I0313 20:30:45.659054 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:45 crc kubenswrapper[4790]: I0313 20:30:45.659086 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:45 crc kubenswrapper[4790]: I0313 20:30:45.659097 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:45 crc kubenswrapper[4790]: I0313 20:30:45.659136 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:45 crc kubenswrapper[4790]: E0313 20:30:45.659523 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:45 crc kubenswrapper[4790]: E0313 20:30:45.659568 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:45 crc kubenswrapper[4790]: E0313 20:30:45.659634 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:45 crc kubenswrapper[4790]: E0313 20:30:45.659689 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:47 crc kubenswrapper[4790]: I0313 20:30:47.524942 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x2tjg_207e7f49-094a-4e59-a8ff-9eacd8d6fe2a/kube-multus/1.log" Mar 13 20:30:47 crc kubenswrapper[4790]: I0313 20:30:47.525469 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x2tjg_207e7f49-094a-4e59-a8ff-9eacd8d6fe2a/kube-multus/0.log" Mar 13 20:30:47 crc kubenswrapper[4790]: I0313 20:30:47.525519 4790 generic.go:334] "Generic (PLEG): container finished" podID="207e7f49-094a-4e59-a8ff-9eacd8d6fe2a" containerID="9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e" exitCode=1 Mar 13 20:30:47 crc kubenswrapper[4790]: I0313 20:30:47.525549 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x2tjg" event={"ID":"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a","Type":"ContainerDied","Data":"9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e"} Mar 13 20:30:47 crc kubenswrapper[4790]: I0313 20:30:47.525585 4790 scope.go:117] "RemoveContainer" containerID="fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139" Mar 13 20:30:47 crc kubenswrapper[4790]: I0313 20:30:47.526009 4790 scope.go:117] "RemoveContainer" containerID="9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e" Mar 13 20:30:47 crc kubenswrapper[4790]: E0313 20:30:47.526298 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-x2tjg_openshift-multus(207e7f49-094a-4e59-a8ff-9eacd8d6fe2a)\"" pod="openshift-multus/multus-x2tjg" podUID="207e7f49-094a-4e59-a8ff-9eacd8d6fe2a" Mar 13 20:30:47 crc kubenswrapper[4790]: I0313 20:30:47.659081 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:47 crc kubenswrapper[4790]: I0313 20:30:47.659163 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:47 crc kubenswrapper[4790]: E0313 20:30:47.659208 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:47 crc kubenswrapper[4790]: I0313 20:30:47.659096 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:47 crc kubenswrapper[4790]: I0313 20:30:47.659096 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:47 crc kubenswrapper[4790]: E0313 20:30:47.659509 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:47 crc kubenswrapper[4790]: E0313 20:30:47.659950 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:47 crc kubenswrapper[4790]: E0313 20:30:47.660053 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:48 crc kubenswrapper[4790]: I0313 20:30:48.530783 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x2tjg_207e7f49-094a-4e59-a8ff-9eacd8d6fe2a/kube-multus/1.log" Mar 13 20:30:49 crc kubenswrapper[4790]: I0313 20:30:49.659011 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:49 crc kubenswrapper[4790]: I0313 20:30:49.659069 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:49 crc kubenswrapper[4790]: I0313 20:30:49.659112 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:49 crc kubenswrapper[4790]: E0313 20:30:49.660003 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:49 crc kubenswrapper[4790]: I0313 20:30:49.660038 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:49 crc kubenswrapper[4790]: E0313 20:30:49.660223 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:49 crc kubenswrapper[4790]: E0313 20:30:49.660309 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:49 crc kubenswrapper[4790]: E0313 20:30:49.660505 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:49 crc kubenswrapper[4790]: E0313 20:30:49.753409 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:30:51 crc kubenswrapper[4790]: I0313 20:30:51.659162 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:51 crc kubenswrapper[4790]: I0313 20:30:51.659208 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:51 crc kubenswrapper[4790]: I0313 20:30:51.659172 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:51 crc kubenswrapper[4790]: E0313 20:30:51.659320 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:51 crc kubenswrapper[4790]: E0313 20:30:51.659492 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:51 crc kubenswrapper[4790]: I0313 20:30:51.659648 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:51 crc kubenswrapper[4790]: E0313 20:30:51.659726 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:51 crc kubenswrapper[4790]: E0313 20:30:51.659932 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:52 crc kubenswrapper[4790]: I0313 20:30:52.660253 4790 scope.go:117] "RemoveContainer" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" Mar 13 20:30:53 crc kubenswrapper[4790]: I0313 20:30:53.480651 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mnf26"] Mar 13 20:30:53 crc kubenswrapper[4790]: I0313 20:30:53.481196 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:53 crc kubenswrapper[4790]: E0313 20:30:53.481413 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:53 crc kubenswrapper[4790]: I0313 20:30:53.550634 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/3.log" Mar 13 20:30:53 crc kubenswrapper[4790]: I0313 20:30:53.553365 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c"} Mar 13 20:30:53 crc kubenswrapper[4790]: I0313 20:30:53.554288 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:30:53 crc kubenswrapper[4790]: I0313 20:30:53.660567 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:53 crc kubenswrapper[4790]: E0313 20:30:53.660796 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:53 crc kubenswrapper[4790]: I0313 20:30:53.661056 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:53 crc kubenswrapper[4790]: E0313 20:30:53.661132 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:53 crc kubenswrapper[4790]: I0313 20:30:53.661928 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:53 crc kubenswrapper[4790]: E0313 20:30:53.661992 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:54 crc kubenswrapper[4790]: E0313 20:30:54.754466 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:30:55 crc kubenswrapper[4790]: I0313 20:30:55.659638 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:55 crc kubenswrapper[4790]: E0313 20:30:55.659765 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:55 crc kubenswrapper[4790]: I0313 20:30:55.659778 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:55 crc kubenswrapper[4790]: I0313 20:30:55.659829 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:55 crc kubenswrapper[4790]: E0313 20:30:55.659946 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:55 crc kubenswrapper[4790]: E0313 20:30:55.660100 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:55 crc kubenswrapper[4790]: I0313 20:30:55.659695 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:55 crc kubenswrapper[4790]: E0313 20:30:55.661062 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:57 crc kubenswrapper[4790]: I0313 20:30:57.659452 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:57 crc kubenswrapper[4790]: I0313 20:30:57.659491 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:57 crc kubenswrapper[4790]: I0313 20:30:57.659522 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:57 crc kubenswrapper[4790]: E0313 20:30:57.659618 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:57 crc kubenswrapper[4790]: I0313 20:30:57.659642 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:57 crc kubenswrapper[4790]: E0313 20:30:57.659738 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:57 crc kubenswrapper[4790]: E0313 20:30:57.659836 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:57 crc kubenswrapper[4790]: E0313 20:30:57.659893 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:59 crc kubenswrapper[4790]: I0313 20:30:59.658814 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:59 crc kubenswrapper[4790]: I0313 20:30:59.658833 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:59 crc kubenswrapper[4790]: I0313 20:30:59.658865 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:59 crc kubenswrapper[4790]: I0313 20:30:59.658891 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:59 crc kubenswrapper[4790]: E0313 20:30:59.660354 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:59 crc kubenswrapper[4790]: E0313 20:30:59.660460 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:59 crc kubenswrapper[4790]: E0313 20:30:59.660507 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:59 crc kubenswrapper[4790]: E0313 20:30:59.660599 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:59 crc kubenswrapper[4790]: I0313 20:30:59.660680 4790 scope.go:117] "RemoveContainer" containerID="9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e" Mar 13 20:30:59 crc kubenswrapper[4790]: I0313 20:30:59.683900 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podStartSLOduration=144.683871778 podStartE2EDuration="2m24.683871778s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:53.589080811 +0000 UTC m=+184.610196742" watchObservedRunningTime="2026-03-13 20:30:59.683871778 +0000 UTC m=+190.704987709" Mar 13 20:30:59 crc kubenswrapper[4790]: E0313 20:30:59.755041 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 20:31:00 crc kubenswrapper[4790]: I0313 20:31:00.577171 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x2tjg_207e7f49-094a-4e59-a8ff-9eacd8d6fe2a/kube-multus/1.log" Mar 13 20:31:00 crc kubenswrapper[4790]: I0313 20:31:00.577238 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x2tjg" event={"ID":"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a","Type":"ContainerStarted","Data":"5a664c8908a82d034ede1821b9b77be44539b262b67dbd487d1b8e0a90a94221"} Mar 13 20:31:01 crc kubenswrapper[4790]: I0313 20:31:01.659570 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:31:01 crc kubenswrapper[4790]: I0313 20:31:01.659615 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:31:01 crc kubenswrapper[4790]: I0313 20:31:01.659851 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:31:01 crc kubenswrapper[4790]: I0313 20:31:01.659873 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:31:01 crc kubenswrapper[4790]: E0313 20:31:01.659955 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:31:01 crc kubenswrapper[4790]: E0313 20:31:01.660078 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:31:01 crc kubenswrapper[4790]: E0313 20:31:01.660122 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:31:01 crc kubenswrapper[4790]: E0313 20:31:01.660178 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:31:03 crc kubenswrapper[4790]: I0313 20:31:03.659326 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:31:03 crc kubenswrapper[4790]: I0313 20:31:03.659399 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:31:03 crc kubenswrapper[4790]: I0313 20:31:03.659442 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:31:03 crc kubenswrapper[4790]: E0313 20:31:03.659489 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:31:03 crc kubenswrapper[4790]: I0313 20:31:03.659505 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:31:03 crc kubenswrapper[4790]: E0313 20:31:03.659673 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:31:03 crc kubenswrapper[4790]: E0313 20:31:03.659815 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:31:03 crc kubenswrapper[4790]: E0313 20:31:03.659892 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:31:05 crc kubenswrapper[4790]: I0313 20:31:05.659872 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:31:05 crc kubenswrapper[4790]: I0313 20:31:05.659900 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:31:05 crc kubenswrapper[4790]: I0313 20:31:05.659934 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:31:05 crc kubenswrapper[4790]: I0313 20:31:05.659991 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:31:05 crc kubenswrapper[4790]: I0313 20:31:05.663879 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 13 20:31:05 crc kubenswrapper[4790]: I0313 20:31:05.664041 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 13 20:31:05 crc kubenswrapper[4790]: I0313 20:31:05.664115 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 13 20:31:05 crc kubenswrapper[4790]: I0313 20:31:05.664611 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 13 20:31:05 crc kubenswrapper[4790]: I0313 20:31:05.665034 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 13 20:31:05 crc kubenswrapper[4790]: I0313 20:31:05.665356 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.546765 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.586724 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-x7zgr"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.587319 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.587487 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.587692 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.588312 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.588956 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.589353 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.589711 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.590360 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-szftl"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.590802 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.591466 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.596477 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.596667 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.603084 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rmlmp"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.603834 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.606624 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.606872 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.607185 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.607657 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.608594 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.608628 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.609442 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.609635 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.609884 4790 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.609981 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.613738 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.613957 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.614085 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.614223 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zsqd7"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.614240 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.614759 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.614300 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.614370 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.614464 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.614525 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.614574 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615225 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615329 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615365 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615438 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615514 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615439 4790 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615522 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615732 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615762 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615796 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615918 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615991 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.616005 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.617862 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.618074 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.618243 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 13 20:31:13 crc kubenswrapper[4790]: 
I0313 20:31:13.618421 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.618644 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.618825 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.619848 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.620173 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.620312 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.620483 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.622535 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.622861 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.623025 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.623178 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"encryption-config-1" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.623321 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.623531 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.623678 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.626848 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.627367 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.627927 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zws8z"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.628291 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.630052 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jfdgz"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.630708 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.632313 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.632417 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.632314 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-zfhhl"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.633740 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.634589 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.634962 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-zfhhl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.635809 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.636156 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqdfm"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.636514 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.636841 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.640814 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.641656 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.649588 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.650130 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688231 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-dir\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688276 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk6nf\" (UniqueName: \"kubernetes.io/projected/c5db072c-5e1d-4149-99c8-aee1209189ba-kube-api-access-nk6nf\") pod \"openshift-apiserver-operator-796bbdcf4f-9l97v\" (UID: \"c5db072c-5e1d-4149-99c8-aee1209189ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688299 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-encryption-config\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688323 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-policies\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688343 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/99d847db-0b8e-4128-af43-a17fe76b77d9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688366 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-client-ca\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688403 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/869d7601-27fe-4a6a-840b-a9811c4d1e06-serving-cert\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688426 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688465 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688483 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688504 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688535 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-etcd-serving-ca\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688552 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-trusted-ca-bundle\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " 
pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688582 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-client-ca\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688606 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbpwx\" (UniqueName: \"kubernetes.io/projected/869d7601-27fe-4a6a-840b-a9811c4d1e06-kube-api-access-fbpwx\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688629 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688652 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-serving-cert\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688682 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-audit\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688702 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99d847db-0b8e-4128-af43-a17fe76b77d9-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688726 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fdm8\" (UniqueName: \"kubernetes.io/projected/9680aeb7-b61a-46a8-baf5-44715261e4a5-kube-api-access-9fdm8\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688745 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljppd\" (UniqueName: \"kubernetes.io/projected/4b88ca59-d36e-4682-99e1-10ef4fa85e10-kube-api-access-ljppd\") pod \"openshift-config-operator-7777fb866f-7ql4r\" (UID: \"4b88ca59-d36e-4682-99e1-10ef4fa85e10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688763 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-szftl\" (UID: 
\"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688806 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688825 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99d847db-0b8e-4128-af43-a17fe76b77d9-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688844 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-config\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688864 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5db072c-5e1d-4149-99c8-aee1209189ba-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9l97v\" (UID: \"c5db072c-5e1d-4149-99c8-aee1209189ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688884 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-node-pullsecrets\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688904 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-image-import-ca\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688928 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-etcd-client\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688946 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-audit-dir\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688967 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688998 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj8kr\" (UniqueName: \"kubernetes.io/projected/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-kube-api-access-lj8kr\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689020 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad15b399-2051-480d-8389-f58f94c10d81-config\") pod \"kube-controller-manager-operator-78b949d7b-bxb2l\" (UID: \"ad15b399-2051-480d-8389-f58f94c10d81\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689061 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwgb9\" (UniqueName: \"kubernetes.io/projected/99d847db-0b8e-4128-af43-a17fe76b77d9-kube-api-access-fwgb9\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689087 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689108 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689140 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkjj2\" (UniqueName: \"kubernetes.io/projected/44748a56-ff71-45b3-a67a-34d5bf7ae56b-kube-api-access-dkjj2\") pod \"cluster-samples-operator-665b6dd947-fhxvv\" (UID: \"44748a56-ff71-45b3-a67a-34d5bf7ae56b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689160 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad15b399-2051-480d-8389-f58f94c10d81-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bxb2l\" (UID: \"ad15b399-2051-480d-8389-f58f94c10d81\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689178 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-config\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689200 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-config\") pod 
\"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689218 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/44748a56-ff71-45b3-a67a-34d5bf7ae56b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fhxvv\" (UID: \"44748a56-ff71-45b3-a67a-34d5bf7ae56b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689236 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6xvl\" (UniqueName: \"kubernetes.io/projected/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-kube-api-access-k6xvl\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689256 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689277 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4b88ca59-d36e-4682-99e1-10ef4fa85e10-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7ql4r\" (UID: \"4b88ca59-d36e-4682-99e1-10ef4fa85e10\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689296 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad15b399-2051-480d-8389-f58f94c10d81-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bxb2l\" (UID: \"ad15b399-2051-480d-8389-f58f94c10d81\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689319 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg4zt\" (UniqueName: \"kubernetes.io/projected/b8f95d7e-96c6-475c-8bef-d72937cc36b4-kube-api-access-qg4zt\") pod \"control-plane-machine-set-operator-78cbb6b69f-qr47g\" (UID: \"b8f95d7e-96c6-475c-8bef-d72937cc36b4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689338 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b88ca59-d36e-4682-99e1-10ef4fa85e10-serving-cert\") pod \"openshift-config-operator-7777fb866f-7ql4r\" (UID: \"4b88ca59-d36e-4682-99e1-10ef4fa85e10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689357 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-serving-cert\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689884 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5db072c-5e1d-4149-99c8-aee1209189ba-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9l97v\" (UID: \"c5db072c-5e1d-4149-99c8-aee1209189ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689913 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b8f95d7e-96c6-475c-8bef-d72937cc36b4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qr47g\" (UID: \"b8f95d7e-96c6-475c-8bef-d72937cc36b4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689934 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.707431 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jtczv"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.708101 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.710868 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.713528 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.714664 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.715763 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.715818 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.715925 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.716087 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.716210 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.717946 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.718193 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.718826 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.719225 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.719646 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.726101 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.726591 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.726844 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-kfl48"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.727068 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.728134 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.743003 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.743050 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.743355 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.743638 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.743964 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.744298 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.744503 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.744540 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.744686 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.744783 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 13 20:31:13 crc 
kubenswrapper[4790]: I0313 20:31:13.744892 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.745369 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.747629 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-q5j7f"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.748722 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.755802 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.755969 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.755809 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.756501 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.756596 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.756815 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.756623 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 13 20:31:13 crc 
kubenswrapper[4790]: I0313 20:31:13.757931 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.758988 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.759409 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.759747 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.760002 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.760016 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.760279 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.761105 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.761746 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.767325 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.771067 4790 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.771194 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.771844 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.772513 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.774316 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-x7zgr"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.775728 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.776126 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.776519 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.777943 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.781779 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.787546 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"trusted-ca" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.787922 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.788166 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.789465 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.789588 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792112 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-audit-dir\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792169 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792196 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj8kr\" (UniqueName: \"kubernetes.io/projected/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-kube-api-access-lj8kr\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " 
pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792213 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad15b399-2051-480d-8389-f58f94c10d81-config\") pod \"kube-controller-manager-operator-78b949d7b-bxb2l\" (UID: \"ad15b399-2051-480d-8389-f58f94c10d81\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792241 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwgb9\" (UniqueName: \"kubernetes.io/projected/99d847db-0b8e-4128-af43-a17fe76b77d9-kube-api-access-fwgb9\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792260 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792282 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792298 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-config\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792313 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-config\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792328 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/44748a56-ff71-45b3-a67a-34d5bf7ae56b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fhxvv\" (UID: \"44748a56-ff71-45b3-a67a-34d5bf7ae56b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792343 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkjj2\" (UniqueName: \"kubernetes.io/projected/44748a56-ff71-45b3-a67a-34d5bf7ae56b-kube-api-access-dkjj2\") pod \"cluster-samples-operator-665b6dd947-fhxvv\" (UID: \"44748a56-ff71-45b3-a67a-34d5bf7ae56b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792359 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad15b399-2051-480d-8389-f58f94c10d81-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bxb2l\" (UID: \"ad15b399-2051-480d-8389-f58f94c10d81\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" Mar 
13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792388 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4b88ca59-d36e-4682-99e1-10ef4fa85e10-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7ql4r\" (UID: \"4b88ca59-d36e-4682-99e1-10ef4fa85e10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792405 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad15b399-2051-480d-8389-f58f94c10d81-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bxb2l\" (UID: \"ad15b399-2051-480d-8389-f58f94c10d81\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792420 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6xvl\" (UniqueName: \"kubernetes.io/projected/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-kube-api-access-k6xvl\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792435 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792450 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg4zt\" (UniqueName: 
\"kubernetes.io/projected/b8f95d7e-96c6-475c-8bef-d72937cc36b4-kube-api-access-qg4zt\") pod \"control-plane-machine-set-operator-78cbb6b69f-qr47g\" (UID: \"b8f95d7e-96c6-475c-8bef-d72937cc36b4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792467 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b88ca59-d36e-4682-99e1-10ef4fa85e10-serving-cert\") pod \"openshift-config-operator-7777fb866f-7ql4r\" (UID: \"4b88ca59-d36e-4682-99e1-10ef4fa85e10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792483 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-serving-cert\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792499 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5db072c-5e1d-4149-99c8-aee1209189ba-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9l97v\" (UID: \"c5db072c-5e1d-4149-99c8-aee1209189ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792515 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b8f95d7e-96c6-475c-8bef-d72937cc36b4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qr47g\" (UID: \"b8f95d7e-96c6-475c-8bef-d72937cc36b4\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792531 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792548 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk6nf\" (UniqueName: \"kubernetes.io/projected/c5db072c-5e1d-4149-99c8-aee1209189ba-kube-api-access-nk6nf\") pod \"openshift-apiserver-operator-796bbdcf4f-9l97v\" (UID: \"c5db072c-5e1d-4149-99c8-aee1209189ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792563 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-dir\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792578 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-encryption-config\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792597 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/99d847db-0b8e-4128-af43-a17fe76b77d9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792610 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-client-ca\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792626 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/869d7601-27fe-4a6a-840b-a9811c4d1e06-serving-cert\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792642 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-policies\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792662 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc 
kubenswrapper[4790]: I0313 20:31:13.792680 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792701 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792715 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792732 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-etcd-serving-ca\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792746 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-trusted-ca-bundle\") pod \"apiserver-76f77b778f-x7zgr\" 
(UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792761 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-client-ca\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792776 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbpwx\" (UniqueName: \"kubernetes.io/projected/869d7601-27fe-4a6a-840b-a9811c4d1e06-kube-api-access-fbpwx\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792804 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-audit\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792824 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792841 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-serving-cert\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792856 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99d847db-0b8e-4128-af43-a17fe76b77d9-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792916 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fdm8\" (UniqueName: \"kubernetes.io/projected/9680aeb7-b61a-46a8-baf5-44715261e4a5-kube-api-access-9fdm8\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792940 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99d847db-0b8e-4128-af43-a17fe76b77d9-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792961 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljppd\" (UniqueName: \"kubernetes.io/projected/4b88ca59-d36e-4682-99e1-10ef4fa85e10-kube-api-access-ljppd\") pod \"openshift-config-operator-7777fb866f-7ql4r\" (UID: \"4b88ca59-d36e-4682-99e1-10ef4fa85e10\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792982 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792997 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.793017 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-config\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.793037 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5db072c-5e1d-4149-99c8-aee1209189ba-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9l97v\" (UID: \"c5db072c-5e1d-4149-99c8-aee1209189ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.793061 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-node-pullsecrets\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.793087 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-image-import-ca\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.793107 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-etcd-client\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.794278 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-audit-dir\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.795257 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad15b399-2051-480d-8389-f58f94c10d81-config\") pod \"kube-controller-manager-operator-78b949d7b-bxb2l\" (UID: \"ad15b399-2051-480d-8389-f58f94c10d81\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.796644 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.796873 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.797543 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4b88ca59-d36e-4682-99e1-10ef4fa85e10-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7ql4r\" (UID: \"4b88ca59-d36e-4682-99e1-10ef4fa85e10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.797891 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-config\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.799156 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-audit\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.799424 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-config\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.799720 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-client-ca\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.803151 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-etcd-serving-ca\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.807721 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99d847db-0b8e-4128-af43-a17fe76b77d9-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.807883 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.808058 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-trusted-ca-bundle\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.808599 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-serving-cert\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") 
" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.808708 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-client-ca\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.808731 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b8f95d7e-96c6-475c-8bef-d72937cc36b4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qr47g\" (UID: \"b8f95d7e-96c6-475c-8bef-d72937cc36b4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.808794 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-dir\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.808923 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.809524 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/c5db072c-5e1d-4149-99c8-aee1209189ba-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9l97v\" (UID: \"c5db072c-5e1d-4149-99c8-aee1209189ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.809607 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/869d7601-27fe-4a6a-840b-a9811c4d1e06-serving-cert\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.810940 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad15b399-2051-480d-8389-f58f94c10d81-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bxb2l\" (UID: \"ad15b399-2051-480d-8389-f58f94c10d81\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.810949 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-config\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.811035 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.811132 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-node-pullsecrets\") pod 
\"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.811891 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/44748a56-ff71-45b3-a67a-34d5bf7ae56b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fhxvv\" (UID: \"44748a56-ff71-45b3-a67a-34d5bf7ae56b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.812252 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5db072c-5e1d-4149-99c8-aee1209189ba-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9l97v\" (UID: \"c5db072c-5e1d-4149-99c8-aee1209189ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.813797 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b88ca59-d36e-4682-99e1-10ef4fa85e10-serving-cert\") pod \"openshift-config-operator-7777fb866f-7ql4r\" (UID: \"4b88ca59-d36e-4682-99e1-10ef4fa85e10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.815616 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.816571 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.816696 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/99d847db-0b8e-4128-af43-a17fe76b77d9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.817035 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.817293 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-encryption-config\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.817709 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-etcd-client\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.818542 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-image-import-ca\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.820224 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.820350 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-serving-cert\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.820817 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.821060 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.821091 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.821465 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.821979 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.822019 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.822211 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-policies\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.822454 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.822606 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.823281 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rmlmp"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.823309 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.823370 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.824192 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-szftl"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.825828 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.827857 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.835410 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.835806 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.836453 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.844330 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.844557 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.853747 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.858910 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.858816 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zsqd7"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.859027 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-pzx4q"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.859877 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jfdgz"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.859958 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.861590 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vp24d"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.862438 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.863196 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.865022 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.868577 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jtczv"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.870119 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.870807 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.870936 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqdfm"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.872186 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.873731 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.875600 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-kfl48"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.876164 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.878477 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.878630 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.879763 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.881323 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.882235 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.882579 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-q5j7f"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.883918 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zfhhl"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.885052 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jnbzb"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.885563 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.886632 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.887664 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.888555 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.888706 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.889777 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zgzvb"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.890771 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.890984 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.890771 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.892614 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.893044 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557230-8pqh8"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.893129 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.893550 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.894355 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zws8z"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.894443 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557230-8pqh8" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.895161 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vp24d"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.896111 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.897157 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.898262 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-vggp9"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.899033 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.899349 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.900895 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.901772 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.902771 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.903778 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.905226 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.907535 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.908691 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.910330 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.910959 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.911044 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jw27w"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.914856 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zwfns"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.915316 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.919203 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.919232 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.919249 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jnbzb"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.919340 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.923634 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.924223 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zgzvb"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.926793 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jw27w"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.930193 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.930406 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zwfns"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.931797 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-cxj7h"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.932907 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cxj7h" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.933419 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557230-8pqh8"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.934698 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cxj7h"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.949864 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.969904 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.989933 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.009928 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.015503 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.015551 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 
20:31:14.029944 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.030703 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.050711 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.069888 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.090302 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.109946 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.130027 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.150529 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.169878 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.190780 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.209815 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.230449 4790 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.250535 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.276285 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.344996 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad15b399-2051-480d-8389-f58f94c10d81-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bxb2l\" (UID: \"ad15b399-2051-480d-8389-f58f94c10d81\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.365169 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj8kr\" (UniqueName: \"kubernetes.io/projected/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-kube-api-access-lj8kr\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.384451 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwgb9\" (UniqueName: \"kubernetes.io/projected/99d847db-0b8e-4128-af43-a17fe76b77d9-kube-api-access-fwgb9\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.399441 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf4v8\" (UniqueName: 
\"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-kube-api-access-zf4v8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.399491 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv9j5\" (UniqueName: \"kubernetes.io/projected/4fa77308-6519-4481-b87b-4a1b066bada3-kube-api-access-rv9j5\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.399538 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/071ab142-7ad6-43bc-aa6a-e6761ea33b15-config\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.399565 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n25p\" (UniqueName: \"kubernetes.io/projected/3635b091-f7bf-4c6d-bb7a-5723b36f990f-kube-api-access-5n25p\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.399662 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/071ab142-7ad6-43bc-aa6a-e6761ea33b15-serving-cert\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.399886 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3635b091-f7bf-4c6d-bb7a-5723b36f990f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400112 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3635b091-f7bf-4c6d-bb7a-5723b36f990f-encryption-config\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400200 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa77308-6519-4481-b87b-4a1b066bada3-config\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400233 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xc7h\" (UniqueName: \"kubernetes.io/projected/071ab142-7ad6-43bc-aa6a-e6761ea33b15-kube-api-access-6xc7h\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400292 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/81949470-5c0d-4294-8618-d6ee14da1d41-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400337 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-trusted-ca\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400442 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4fa77308-6519-4481-b87b-4a1b066bada3-machine-approver-tls\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400474 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/071ab142-7ad6-43bc-aa6a-e6761ea33b15-service-ca-bundle\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400593 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-registry-certificates\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400638 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4fa77308-6519-4481-b87b-4a1b066bada3-auth-proxy-config\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400665 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzpr9\" (UniqueName: \"kubernetes.io/projected/6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150-kube-api-access-hzpr9\") pod \"downloads-7954f5f757-zfhhl\" (UID: \"6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150\") " pod="openshift-console/downloads-7954f5f757-zfhhl" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400705 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3635b091-f7bf-4c6d-bb7a-5723b36f990f-serving-cert\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400762 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a626166a-5d74-4dd9-b838-746731bfedef-images\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400779 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/3635b091-f7bf-4c6d-bb7a-5723b36f990f-audit-policies\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400846 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw6b8\" (UniqueName: \"kubernetes.io/projected/a626166a-5d74-4dd9-b838-746731bfedef-kube-api-access-vw6b8\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400888 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94386d3d-038a-4e4d-9e97-fd04336847a0-trusted-ca\") pod \"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400904 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr5wq\" (UniqueName: \"kubernetes.io/projected/94386d3d-038a-4e4d-9e97-fd04336847a0-kube-api-access-dr5wq\") pod \"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400919 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/071ab142-7ad6-43bc-aa6a-e6761ea33b15-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400942 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3635b091-f7bf-4c6d-bb7a-5723b36f990f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400997 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a626166a-5d74-4dd9-b838-746731bfedef-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.401027 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3635b091-f7bf-4c6d-bb7a-5723b36f990f-audit-dir\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.401051 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3635b091-f7bf-4c6d-bb7a-5723b36f990f-etcd-client\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.401072 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/94386d3d-038a-4e4d-9e97-fd04336847a0-config\") pod \"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.401128 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-registry-tls\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.401154 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-bound-sa-token\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.401202 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94386d3d-038a-4e4d-9e97-fd04336847a0-serving-cert\") pod \"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.401257 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc 
kubenswrapper[4790]: I0313 20:31:14.401299 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/81949470-5c0d-4294-8618-d6ee14da1d41-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.401325 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a626166a-5d74-4dd9-b838-746731bfedef-config\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:14 crc kubenswrapper[4790]: E0313 20:31:14.401685 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:14.901634808 +0000 UTC m=+205.922750769 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.405159 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkjj2\" (UniqueName: \"kubernetes.io/projected/44748a56-ff71-45b3-a67a-34d5bf7ae56b-kube-api-access-dkjj2\") pod \"cluster-samples-operator-665b6dd947-fhxvv\" (UID: \"44748a56-ff71-45b3-a67a-34d5bf7ae56b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.424777 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6xvl\" (UniqueName: \"kubernetes.io/projected/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-kube-api-access-k6xvl\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.443899 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljppd\" (UniqueName: \"kubernetes.io/projected/4b88ca59-d36e-4682-99e1-10ef4fa85e10-kube-api-access-ljppd\") pod \"openshift-config-operator-7777fb866f-7ql4r\" (UID: \"4b88ca59-d36e-4682-99e1-10ef4fa85e10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.464933 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg4zt\" (UniqueName: 
\"kubernetes.io/projected/b8f95d7e-96c6-475c-8bef-d72937cc36b4-kube-api-access-qg4zt\") pod \"control-plane-machine-set-operator-78cbb6b69f-qr47g\" (UID: \"b8f95d7e-96c6-475c-8bef-d72937cc36b4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.483420 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.484047 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk6nf\" (UniqueName: \"kubernetes.io/projected/c5db072c-5e1d-4149-99c8-aee1209189ba-kube-api-access-nk6nf\") pod \"openshift-apiserver-operator-796bbdcf4f-9l97v\" (UID: \"c5db072c-5e1d-4149-99c8-aee1209189ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.494027 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.502338 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.502643 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75sc9\" (UniqueName: \"kubernetes.io/projected/bcf10b74-f8ce-4748-a813-5aefe86f13f7-kube-api-access-75sc9\") pod \"ingress-canary-cxj7h\" (UID: \"bcf10b74-f8ce-4748-a813-5aefe86f13f7\") " pod="openshift-ingress-canary/ingress-canary-cxj7h" Mar 13 20:31:14 crc kubenswrapper[4790]: E0313 20:31:14.502664 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.002636105 +0000 UTC m=+206.023751996 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.502709 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d88c0d3c-4e7a-4dd8-a99d-6118b840c031-proxy-tls\") pod \"machine-config-controller-84d6567774-hz5vf\" (UID: \"d88c0d3c-4e7a-4dd8-a99d-6118b840c031\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.502748 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f9c2f7c-9058-4ad2-84a2-037d212792ad-config-volume\") pod \"dns-default-zwfns\" (UID: \"5f9c2f7c-9058-4ad2-84a2-037d212792ad\") " pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.502769 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21386249-439b-4454-828b-f9da9ecce958-etcd-client\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.502822 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-node-bootstrap-token\") pod 
\"machine-config-server-vggp9\" (UID: \"979fe4d1-6e0f-4b07-b994-c183a200a1cc\") " pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.502842 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-config-volume\") pod \"collect-profiles-29557230-rjmvn\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.502864 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32d4b8de-5800-44a1-b2d9-338e4d267866-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v7kxq\" (UID: \"32d4b8de-5800-44a1-b2d9-338e4d267866\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.502927 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4af8dabc-a918-4188-8257-112b5f8d71d0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mg7x\" (UID: \"4af8dabc-a918-4188-8257-112b5f8d71d0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503058 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzpr9\" (UniqueName: \"kubernetes.io/projected/6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150-kube-api-access-hzpr9\") pod \"downloads-7954f5f757-zfhhl\" (UID: \"6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150\") " pod="openshift-console/downloads-7954f5f757-zfhhl" Mar 13 20:31:14 crc 
kubenswrapper[4790]: I0313 20:31:14.503100 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3635b091-f7bf-4c6d-bb7a-5723b36f990f-serving-cert\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503122 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-secret-volume\") pod \"collect-profiles-29557230-rjmvn\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503147 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a626166a-5d74-4dd9-b838-746731bfedef-images\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503169 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3635b091-f7bf-4c6d-bb7a-5723b36f990f-audit-policies\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503188 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55-srv-cert\") pod \"catalog-operator-68c6474976-mcrq2\" (UID: \"c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503208 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jnbzb\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503233 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbmtd\" (UniqueName: \"kubernetes.io/projected/4e8cc2ad-07fc-4d24-956e-94599d58be06-kube-api-access-qbmtd\") pod \"dns-operator-744455d44c-jtczv\" (UID: \"4e8cc2ad-07fc-4d24-956e-94599d58be06\") " pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503255 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq5sl\" (UniqueName: \"kubernetes.io/projected/d88c0d3c-4e7a-4dd8-a99d-6118b840c031-kube-api-access-hq5sl\") pod \"machine-config-controller-84d6567774-hz5vf\" (UID: \"d88c0d3c-4e7a-4dd8-a99d-6118b840c031\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503457 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bc71397-bb77-45b3-92c4-77710458d4fe-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-rwgfw\" (UID: \"0bc71397-bb77-45b3-92c4-77710458d4fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503580 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-webhook-cert\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503627 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw6b8\" (UniqueName: \"kubernetes.io/projected/a626166a-5d74-4dd9-b838-746731bfedef-kube-api-access-vw6b8\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503653 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-service-ca-bundle\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503674 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55-profile-collector-cert\") pod \"catalog-operator-68c6474976-mcrq2\" (UID: \"c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503691 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d66s7\" (UniqueName: \"kubernetes.io/projected/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-kube-api-access-d66s7\") pod 
\"collect-profiles-29557230-rjmvn\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503710 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bc71397-bb77-45b3-92c4-77710458d4fe-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-rwgfw\" (UID: \"0bc71397-bb77-45b3-92c4-77710458d4fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503789 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr5wq\" (UniqueName: \"kubernetes.io/projected/94386d3d-038a-4e4d-9e97-fd04336847a0-kube-api-access-dr5wq\") pod \"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503837 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3635b091-f7bf-4c6d-bb7a-5723b36f990f-audit-policies\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504050 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a626166a-5d74-4dd9-b838-746731bfedef-images\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504144 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jnbzb\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504256 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfldl\" (UniqueName: \"kubernetes.io/projected/53c38463-b7c5-42c8-a447-7d0e7f190aa9-kube-api-access-zfldl\") pod \"marketplace-operator-79b997595-jnbzb\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504324 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwxj6\" (UniqueName: \"kubernetes.io/projected/8313e458-290f-42ba-8656-dc9dcf0e0b98-kube-api-access-kwxj6\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504414 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3635b091-f7bf-4c6d-bb7a-5723b36f990f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504444 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4cfd91e9-ce88-4004-b936-551d50d26a7d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q2wgf\" (UID: \"4cfd91e9-ce88-4004-b936-551d50d26a7d\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504465 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm22j\" (UniqueName: \"kubernetes.io/projected/5f9c2f7c-9058-4ad2-84a2-037d212792ad-kube-api-access-bm22j\") pod \"dns-default-zwfns\" (UID: \"5f9c2f7c-9058-4ad2-84a2-037d212792ad\") " pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504487 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3635b091-f7bf-4c6d-bb7a-5723b36f990f-audit-dir\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504507 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8rjf\" (UniqueName: \"kubernetes.io/projected/aa273b20-a91d-43ea-a18d-784ad7cdc7a7-kube-api-access-b8rjf\") pod \"service-ca-operator-777779d784-vs2vp\" (UID: \"aa273b20-a91d-43ea-a18d-784ad7cdc7a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504528 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-trusted-ca-bundle\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504549 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3635b091-f7bf-4c6d-bb7a-5723b36f990f-etcd-client\") pod 
\"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504568 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-tmpfs\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504593 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-csi-data-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504615 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpwnr\" (UniqueName: \"kubernetes.io/projected/9e6c6344-8059-43d7-97be-273d115b8471-kube-api-access-gpwnr\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504638 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-default-certificate\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504695 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f612fb7-c001-4a97-b17c-008bcf100be1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504713 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx89m\" (UniqueName: \"kubernetes.io/projected/929728d6-959b-4532-a9de-298aed7edb3f-kube-api-access-vx89m\") pod \"service-ca-9c57cc56f-zgzvb\" (UID: \"929728d6-959b-4532-a9de-298aed7edb3f\") " pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504738 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504757 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/81949470-5c0d-4294-8618-d6ee14da1d41-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504775 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9vps\" (UniqueName: \"kubernetes.io/projected/4af8dabc-a918-4188-8257-112b5f8d71d0-kube-api-access-x9vps\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mg7x\" (UID: \"4af8dabc-a918-4188-8257-112b5f8d71d0\") 
" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504813 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf4v8\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-kube-api-access-zf4v8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504842 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42ljl\" (UniqueName: \"kubernetes.io/projected/2f612fb7-c001-4a97-b17c-008bcf100be1-kube-api-access-42ljl\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504863 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/071ab142-7ad6-43bc-aa6a-e6761ea33b15-config\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504882 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jvkr\" (UniqueName: \"kubernetes.io/projected/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-kube-api-access-5jvkr\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504902 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-oauth-serving-cert\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504921 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/71ed135e-3db4-4f03-a89e-f82bc3cf0b34-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vp24d\" (UID: \"71ed135e-3db4-4f03-a89e-f82bc3cf0b34\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504940 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/071ab142-7ad6-43bc-aa6a-e6761ea33b15-serving-cert\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504956 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3635b091-f7bf-4c6d-bb7a-5723b36f990f-encryption-config\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504976 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chlbb\" (UniqueName: \"kubernetes.io/projected/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-kube-api-access-chlbb\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " 
pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504993 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-oauth-config\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.505000 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3635b091-f7bf-4c6d-bb7a-5723b36f990f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.505025 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-metrics-certs\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.505090 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3635b091-f7bf-4c6d-bb7a-5723b36f990f-audit-dir\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506012 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa77308-6519-4481-b87b-4a1b066bada3-config\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:14 crc kubenswrapper[4790]: E0313 20:31:14.506048 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.006036327 +0000 UTC m=+206.027152288 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506074 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xc7h\" (UniqueName: \"kubernetes.io/projected/071ab142-7ad6-43bc-aa6a-e6761ea33b15-kube-api-access-6xc7h\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506291 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa273b20-a91d-43ea-a18d-784ad7cdc7a7-config\") pod \"service-ca-operator-777779d784-vs2vp\" (UID: \"aa273b20-a91d-43ea-a18d-784ad7cdc7a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506324 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/929728d6-959b-4532-a9de-298aed7edb3f-signing-key\") pod \"service-ca-9c57cc56f-zgzvb\" (UID: \"929728d6-959b-4532-a9de-298aed7edb3f\") " pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506422 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/071ab142-7ad6-43bc-aa6a-e6761ea33b15-config\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506450 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sl24\" (UniqueName: \"kubernetes.io/projected/32d4b8de-5800-44a1-b2d9-338e4d267866-kube-api-access-8sl24\") pod \"kube-storage-version-migrator-operator-b67b599dd-v7kxq\" (UID: \"32d4b8de-5800-44a1-b2d9-338e4d267866\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506502 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/81949470-5c0d-4294-8618-d6ee14da1d41-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506538 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa77308-6519-4481-b87b-4a1b066bada3-config\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 
20:31:14.506546 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/81949470-5c0d-4294-8618-d6ee14da1d41-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506597 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21386249-439b-4454-828b-f9da9ecce958-serving-cert\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506699 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-trusted-ca\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506745 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-mountpoint-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506913 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa273b20-a91d-43ea-a18d-784ad7cdc7a7-serving-cert\") pod \"service-ca-operator-777779d784-vs2vp\" (UID: \"aa273b20-a91d-43ea-a18d-784ad7cdc7a7\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.507052 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcf10b74-f8ce-4748-a813-5aefe86f13f7-cert\") pod \"ingress-canary-cxj7h\" (UID: \"bcf10b74-f8ce-4748-a813-5aefe86f13f7\") " pod="openshift-ingress-canary/ingress-canary-cxj7h" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.507278 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8313e458-290f-42ba-8656-dc9dcf0e0b98-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.507443 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-registry-certificates\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.507567 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4fa77308-6519-4481-b87b-4a1b066bada3-auth-proxy-config\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.507689 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/4fa77308-6519-4481-b87b-4a1b066bada3-machine-approver-tls\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.507776 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/071ab142-7ad6-43bc-aa6a-e6761ea33b15-service-ca-bundle\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.507809 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4af8dabc-a918-4188-8257-112b5f8d71d0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mg7x\" (UID: \"4af8dabc-a918-4188-8257-112b5f8d71d0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.507946 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8ttg\" (UniqueName: \"kubernetes.io/projected/4cfd91e9-ce88-4004-b936-551d50d26a7d-kube-api-access-p8ttg\") pod \"olm-operator-6b444d44fb-q2wgf\" (UID: \"4cfd91e9-ce88-4004-b936-551d50d26a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.508043 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3baed13c-c4c1-4fc2-9527-bfd2273efbbb-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wksbn\" (UID: 
\"3baed13c-c4c1-4fc2-9527-bfd2273efbbb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.508066 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2f612fb7-c001-4a97-b17c-008bcf100be1-images\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.508115 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-certs\") pod \"machine-config-server-vggp9\" (UID: \"979fe4d1-6e0f-4b07-b994-c183a200a1cc\") " pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.508152 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3635b091-f7bf-4c6d-bb7a-5723b36f990f-etcd-client\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.508156 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gbdr\" (UniqueName: \"kubernetes.io/projected/c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55-kube-api-access-9gbdr\") pod \"catalog-operator-68c6474976-mcrq2\" (UID: \"c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.508196 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-trusted-ca\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.508204 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3baed13c-c4c1-4fc2-9527-bfd2273efbbb-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wksbn\" (UID: \"3baed13c-c4c1-4fc2-9527-bfd2273efbbb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.508270 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-socket-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.509604 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4fa77308-6519-4481-b87b-4a1b066bada3-auth-proxy-config\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.509668 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbpwx\" (UniqueName: \"kubernetes.io/projected/869d7601-27fe-4a6a-840b-a9811c4d1e06-kube-api-access-fbpwx\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 
20:31:14.509688 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3635b091-f7bf-4c6d-bb7a-5723b36f990f-encryption-config\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.509845 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-registry-certificates\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.508293 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32d4b8de-5800-44a1-b2d9-338e4d267866-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v7kxq\" (UID: \"32d4b8de-5800-44a1-b2d9-338e4d267866\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510281 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2b96\" (UniqueName: \"kubernetes.io/projected/979fe4d1-6e0f-4b07-b994-c183a200a1cc-kube-api-access-l2b96\") pod \"machine-config-server-vggp9\" (UID: \"979fe4d1-6e0f-4b07-b994-c183a200a1cc\") " pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510308 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21386249-439b-4454-828b-f9da9ecce958-config\") pod \"etcd-operator-b45778765-kfl48\" (UID: 
\"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510417 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/81949470-5c0d-4294-8618-d6ee14da1d41-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510431 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94386d3d-038a-4e4d-9e97-fd04336847a0-trusted-ca\") pod \"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510516 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/071ab142-7ad6-43bc-aa6a-e6761ea33b15-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510600 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f612fb7-c001-4a97-b17c-008bcf100be1-proxy-tls\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510633 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/0bc71397-bb77-45b3-92c4-77710458d4fe-config\") pod \"kube-apiserver-operator-766d6c64bb-rwgfw\" (UID: \"0bc71397-bb77-45b3-92c4-77710458d4fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510694 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3baed13c-c4c1-4fc2-9527-bfd2273efbbb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wksbn\" (UID: \"3baed13c-c4c1-4fc2-9527-bfd2273efbbb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510721 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/31b24f51-5194-4af5-a171-bd55caaf8ded-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cszm6\" (UID: \"31b24f51-5194-4af5-a171-bd55caaf8ded\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510775 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/21386249-439b-4454-828b-f9da9ecce958-etcd-service-ca\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510798 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-serving-cert\") pod \"console-f9d7485db-q5j7f\" (UID: 
\"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510854 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a626166a-5d74-4dd9-b838-746731bfedef-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510878 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4lb2\" (UniqueName: \"kubernetes.io/projected/71ed135e-3db4-4f03-a89e-f82bc3cf0b34-kube-api-access-h4lb2\") pod \"multus-admission-controller-857f4d67dd-vp24d\" (UID: \"71ed135e-3db4-4f03-a89e-f82bc3cf0b34\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510934 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5dd2\" (UniqueName: \"kubernetes.io/projected/631645f5-2f1a-41e7-ba2a-a665c827acb5-kube-api-access-t5dd2\") pod \"migrator-59844c95c7-qsg78\" (UID: \"631645f5-2f1a-41e7-ba2a-a665c827acb5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510962 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/929728d6-959b-4532-a9de-298aed7edb3f-signing-cabundle\") pod \"service-ca-9c57cc56f-zgzvb\" (UID: \"929728d6-959b-4532-a9de-298aed7edb3f\") " pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511013 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8313e458-290f-42ba-8656-dc9dcf0e0b98-trusted-ca\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511040 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f9c2f7c-9058-4ad2-84a2-037d212792ad-metrics-tls\") pod \"dns-default-zwfns\" (UID: \"5f9c2f7c-9058-4ad2-84a2-037d212792ad\") " pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511087 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d88c0d3c-4e7a-4dd8-a99d-6118b840c031-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hz5vf\" (UID: \"d88c0d3c-4e7a-4dd8-a99d-6118b840c031\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511115 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-registry-tls\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511160 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-bound-sa-token\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: 
I0313 20:31:14.511204 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94386d3d-038a-4e4d-9e97-fd04336847a0-config\") pod \"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511286 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4cfd91e9-ce88-4004-b936-551d50d26a7d-srv-cert\") pod \"olm-operator-6b444d44fb-q2wgf\" (UID: \"4cfd91e9-ce88-4004-b936-551d50d26a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511310 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-config\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511352 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94386d3d-038a-4e4d-9e97-fd04336847a0-serving-cert\") pod \"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511370 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb9zk\" (UniqueName: \"kubernetes.io/projected/d598b7c0-7c77-4903-9138-d8a3d01f9efe-kube-api-access-hb9zk\") pod \"auto-csr-approver-29557230-8pqh8\" (UID: \"d598b7c0-7c77-4903-9138-d8a3d01f9efe\") " 
pod="openshift-infra/auto-csr-approver-29557230-8pqh8" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511413 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-plugins-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511431 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a626166a-5d74-4dd9-b838-746731bfedef-config\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511449 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmp7h\" (UniqueName: \"kubernetes.io/projected/31b24f51-5194-4af5-a171-bd55caaf8ded-kube-api-access-zmp7h\") pod \"package-server-manager-789f6589d5-cszm6\" (UID: \"31b24f51-5194-4af5-a171-bd55caaf8ded\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511465 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t4tw\" (UniqueName: \"kubernetes.io/projected/21386249-439b-4454-828b-f9da9ecce958-kube-api-access-8t4tw\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511487 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv9j5\" (UniqueName: 
\"kubernetes.io/projected/4fa77308-6519-4481-b87b-4a1b066bada3-kube-api-access-rv9j5\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511520 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-stats-auth\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511541 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-service-ca\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511578 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chx4v\" (UniqueName: \"kubernetes.io/projected/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-kube-api-access-chx4v\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511612 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8313e458-290f-42ba-8656-dc9dcf0e0b98-metrics-tls\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511644 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n25p\" (UniqueName: \"kubernetes.io/projected/3635b091-f7bf-4c6d-bb7a-5723b36f990f-kube-api-access-5n25p\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511680 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-apiservice-cert\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511754 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/21386249-439b-4454-828b-f9da9ecce958-etcd-ca\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511825 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3635b091-f7bf-4c6d-bb7a-5723b36f990f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511879 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e8cc2ad-07fc-4d24-956e-94599d58be06-metrics-tls\") pod \"dns-operator-744455d44c-jtczv\" (UID: \"4e8cc2ad-07fc-4d24-956e-94599d58be06\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511898 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-registration-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.512554 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94386d3d-038a-4e4d-9e97-fd04336847a0-config\") pod \"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511540 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/071ab142-7ad6-43bc-aa6a-e6761ea33b15-service-ca-bundle\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.514054 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a626166a-5d74-4dd9-b838-746731bfedef-config\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.514206 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/071ab142-7ad6-43bc-aa6a-e6761ea33b15-serving-cert\") pod \"authentication-operator-69f744f599-zws8z\" (UID: 
\"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.514822 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4fa77308-6519-4481-b87b-4a1b066bada3-machine-approver-tls\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.515062 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-registry-tls\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.515166 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3635b091-f7bf-4c6d-bb7a-5723b36f990f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.517483 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94386d3d-038a-4e4d-9e97-fd04336847a0-serving-cert\") pod \"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.517579 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94386d3d-038a-4e4d-9e97-fd04336847a0-trusted-ca\") pod 
\"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.520584 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/071ab142-7ad6-43bc-aa6a-e6761ea33b15-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.520886 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3635b091-f7bf-4c6d-bb7a-5723b36f990f-serving-cert\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.522101 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a626166a-5d74-4dd9-b838-746731bfedef-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.527669 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.529036 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fdm8\" (UniqueName: \"kubernetes.io/projected/9680aeb7-b61a-46a8-baf5-44715261e4a5-kube-api-access-9fdm8\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.534641 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.547897 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99d847db-0b8e-4128-af43-a17fe76b77d9-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.550037 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.558656 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.571251 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.582659 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.590731 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.604670 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.611215 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.613957 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.614172 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb9zk\" (UniqueName: \"kubernetes.io/projected/d598b7c0-7c77-4903-9138-d8a3d01f9efe-kube-api-access-hb9zk\") pod \"auto-csr-approver-29557230-8pqh8\" (UID: \"d598b7c0-7c77-4903-9138-d8a3d01f9efe\") " pod="openshift-infra/auto-csr-approver-29557230-8pqh8" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.614517 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:14 crc kubenswrapper[4790]: E0313 20:31:14.614707 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.114681532 +0000 UTC m=+206.135797433 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615005 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-plugins-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615052 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmp7h\" (UniqueName: \"kubernetes.io/projected/31b24f51-5194-4af5-a171-bd55caaf8ded-kube-api-access-zmp7h\") pod \"package-server-manager-789f6589d5-cszm6\" (UID: \"31b24f51-5194-4af5-a171-bd55caaf8ded\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615078 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t4tw\" (UniqueName: 
\"kubernetes.io/projected/21386249-439b-4454-828b-f9da9ecce958-kube-api-access-8t4tw\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615102 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chx4v\" (UniqueName: \"kubernetes.io/projected/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-kube-api-access-chx4v\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615134 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-stats-auth\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615154 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-service-ca\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615178 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8313e458-290f-42ba-8656-dc9dcf0e0b98-metrics-tls\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615215 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/21386249-439b-4454-828b-f9da9ecce958-etcd-ca\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615239 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-apiservice-cert\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615292 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-registration-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615313 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e8cc2ad-07fc-4d24-956e-94599d58be06-metrics-tls\") pod \"dns-operator-744455d44c-jtczv\" (UID: \"4e8cc2ad-07fc-4d24-956e-94599d58be06\") " pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615333 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75sc9\" (UniqueName: \"kubernetes.io/projected/bcf10b74-f8ce-4748-a813-5aefe86f13f7-kube-api-access-75sc9\") pod \"ingress-canary-cxj7h\" (UID: \"bcf10b74-f8ce-4748-a813-5aefe86f13f7\") " pod="openshift-ingress-canary/ingress-canary-cxj7h" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615360 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/d88c0d3c-4e7a-4dd8-a99d-6118b840c031-proxy-tls\") pod \"machine-config-controller-84d6567774-hz5vf\" (UID: \"d88c0d3c-4e7a-4dd8-a99d-6118b840c031\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615399 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21386249-439b-4454-828b-f9da9ecce958-etcd-client\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615419 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f9c2f7c-9058-4ad2-84a2-037d212792ad-config-volume\") pod \"dns-default-zwfns\" (UID: \"5f9c2f7c-9058-4ad2-84a2-037d212792ad\") " pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615437 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-node-bootstrap-token\") pod \"machine-config-server-vggp9\" (UID: \"979fe4d1-6e0f-4b07-b994-c183a200a1cc\") " pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615456 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-config-volume\") pod \"collect-profiles-29557230-rjmvn\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615470 4790 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-plugins-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615477 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4af8dabc-a918-4188-8257-112b5f8d71d0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mg7x\" (UID: \"4af8dabc-a918-4188-8257-112b5f8d71d0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615538 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32d4b8de-5800-44a1-b2d9-338e4d267866-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v7kxq\" (UID: \"32d4b8de-5800-44a1-b2d9-338e4d267866\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615578 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-secret-volume\") pod \"collect-profiles-29557230-rjmvn\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615604 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55-srv-cert\") pod \"catalog-operator-68c6474976-mcrq2\" (UID: \"c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615632 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jnbzb\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615658 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq5sl\" (UniqueName: \"kubernetes.io/projected/d88c0d3c-4e7a-4dd8-a99d-6118b840c031-kube-api-access-hq5sl\") pod \"machine-config-controller-84d6567774-hz5vf\" (UID: \"d88c0d3c-4e7a-4dd8-a99d-6118b840c031\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615680 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbmtd\" (UniqueName: \"kubernetes.io/projected/4e8cc2ad-07fc-4d24-956e-94599d58be06-kube-api-access-qbmtd\") pod \"dns-operator-744455d44c-jtczv\" (UID: \"4e8cc2ad-07fc-4d24-956e-94599d58be06\") " pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615703 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-webhook-cert\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615726 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0bc71397-bb77-45b3-92c4-77710458d4fe-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-rwgfw\" (UID: \"0bc71397-bb77-45b3-92c4-77710458d4fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615746 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-service-ca-bundle\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615812 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55-profile-collector-cert\") pod \"catalog-operator-68c6474976-mcrq2\" (UID: \"c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615831 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d66s7\" (UniqueName: \"kubernetes.io/projected/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-kube-api-access-d66s7\") pod \"collect-profiles-29557230-rjmvn\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615861 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bc71397-bb77-45b3-92c4-77710458d4fe-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-rwgfw\" (UID: \"0bc71397-bb77-45b3-92c4-77710458d4fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" Mar 13 20:31:14 crc 
kubenswrapper[4790]: I0313 20:31:14.615894 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jnbzb\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615929 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfldl\" (UniqueName: \"kubernetes.io/projected/53c38463-b7c5-42c8-a447-7d0e7f190aa9-kube-api-access-zfldl\") pod \"marketplace-operator-79b997595-jnbzb\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615952 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwxj6\" (UniqueName: \"kubernetes.io/projected/8313e458-290f-42ba-8656-dc9dcf0e0b98-kube-api-access-kwxj6\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615976 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4cfd91e9-ce88-4004-b936-551d50d26a7d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q2wgf\" (UID: \"4cfd91e9-ce88-4004-b936-551d50d26a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615999 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm22j\" (UniqueName: \"kubernetes.io/projected/5f9c2f7c-9058-4ad2-84a2-037d212792ad-kube-api-access-bm22j\") pod 
\"dns-default-zwfns\" (UID: \"5f9c2f7c-9058-4ad2-84a2-037d212792ad\") " pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616022 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8rjf\" (UniqueName: \"kubernetes.io/projected/aa273b20-a91d-43ea-a18d-784ad7cdc7a7-kube-api-access-b8rjf\") pod \"service-ca-operator-777779d784-vs2vp\" (UID: \"aa273b20-a91d-43ea-a18d-784ad7cdc7a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616044 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-trusted-ca-bundle\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616064 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-tmpfs\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616084 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-csi-data-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616106 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpwnr\" (UniqueName: 
\"kubernetes.io/projected/9e6c6344-8059-43d7-97be-273d115b8471-kube-api-access-gpwnr\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616131 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-default-certificate\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616158 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f612fb7-c001-4a97-b17c-008bcf100be1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616182 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx89m\" (UniqueName: \"kubernetes.io/projected/929728d6-959b-4532-a9de-298aed7edb3f-kube-api-access-vx89m\") pod \"service-ca-9c57cc56f-zgzvb\" (UID: \"929728d6-959b-4532-a9de-298aed7edb3f\") " pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616209 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 
20:31:14.616234 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9vps\" (UniqueName: \"kubernetes.io/projected/4af8dabc-a918-4188-8257-112b5f8d71d0-kube-api-access-x9vps\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mg7x\" (UID: \"4af8dabc-a918-4188-8257-112b5f8d71d0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616273 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42ljl\" (UniqueName: \"kubernetes.io/projected/2f612fb7-c001-4a97-b17c-008bcf100be1-kube-api-access-42ljl\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616299 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jvkr\" (UniqueName: \"kubernetes.io/projected/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-kube-api-access-5jvkr\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616319 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-oauth-serving-cert\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616344 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/71ed135e-3db4-4f03-a89e-f82bc3cf0b34-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-vp24d\" (UID: \"71ed135e-3db4-4f03-a89e-f82bc3cf0b34\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616405 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-service-ca\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616371 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-oauth-config\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616488 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chlbb\" (UniqueName: \"kubernetes.io/projected/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-kube-api-access-chlbb\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616521 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-metrics-certs\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616543 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa273b20-a91d-43ea-a18d-784ad7cdc7a7-config\") pod 
\"service-ca-operator-777779d784-vs2vp\" (UID: \"aa273b20-a91d-43ea-a18d-784ad7cdc7a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616565 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/929728d6-959b-4532-a9de-298aed7edb3f-signing-key\") pod \"service-ca-9c57cc56f-zgzvb\" (UID: \"929728d6-959b-4532-a9de-298aed7edb3f\") " pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616611 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sl24\" (UniqueName: \"kubernetes.io/projected/32d4b8de-5800-44a1-b2d9-338e4d267866-kube-api-access-8sl24\") pod \"kube-storage-version-migrator-operator-b67b599dd-v7kxq\" (UID: \"32d4b8de-5800-44a1-b2d9-338e4d267866\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616636 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21386249-439b-4454-828b-f9da9ecce958-serving-cert\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-mountpoint-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616683 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/aa273b20-a91d-43ea-a18d-784ad7cdc7a7-serving-cert\") pod \"service-ca-operator-777779d784-vs2vp\" (UID: \"aa273b20-a91d-43ea-a18d-784ad7cdc7a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616702 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcf10b74-f8ce-4748-a813-5aefe86f13f7-cert\") pod \"ingress-canary-cxj7h\" (UID: \"bcf10b74-f8ce-4748-a813-5aefe86f13f7\") " pod="openshift-ingress-canary/ingress-canary-cxj7h" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616726 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8313e458-290f-42ba-8656-dc9dcf0e0b98-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616756 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4af8dabc-a918-4188-8257-112b5f8d71d0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mg7x\" (UID: \"4af8dabc-a918-4188-8257-112b5f8d71d0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616830 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8ttg\" (UniqueName: \"kubernetes.io/projected/4cfd91e9-ce88-4004-b936-551d50d26a7d-kube-api-access-p8ttg\") pod \"olm-operator-6b444d44fb-q2wgf\" (UID: \"4cfd91e9-ce88-4004-b936-551d50d26a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616863 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3baed13c-c4c1-4fc2-9527-bfd2273efbbb-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wksbn\" (UID: \"3baed13c-c4c1-4fc2-9527-bfd2273efbbb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616889 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2f612fb7-c001-4a97-b17c-008bcf100be1-images\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616911 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-certs\") pod \"machine-config-server-vggp9\" (UID: \"979fe4d1-6e0f-4b07-b994-c183a200a1cc\") " pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616932 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gbdr\" (UniqueName: \"kubernetes.io/projected/c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55-kube-api-access-9gbdr\") pod \"catalog-operator-68c6474976-mcrq2\" (UID: \"c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616941 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/21386249-439b-4454-828b-f9da9ecce958-etcd-ca\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616952 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3baed13c-c4c1-4fc2-9527-bfd2273efbbb-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wksbn\" (UID: \"3baed13c-c4c1-4fc2-9527-bfd2273efbbb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616977 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-socket-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.617001 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32d4b8de-5800-44a1-b2d9-338e4d267866-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v7kxq\" (UID: \"32d4b8de-5800-44a1-b2d9-338e4d267866\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.617025 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2b96\" (UniqueName: \"kubernetes.io/projected/979fe4d1-6e0f-4b07-b994-c183a200a1cc-kube-api-access-l2b96\") pod \"machine-config-server-vggp9\" (UID: \"979fe4d1-6e0f-4b07-b994-c183a200a1cc\") " pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.617042 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-registration-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.617052 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21386249-439b-4454-828b-f9da9ecce958-config\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.617078 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f612fb7-c001-4a97-b17c-008bcf100be1-proxy-tls\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.617103 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bc71397-bb77-45b3-92c4-77710458d4fe-config\") pod \"kube-apiserver-operator-766d6c64bb-rwgfw\" (UID: \"0bc71397-bb77-45b3-92c4-77710458d4fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.617127 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3baed13c-c4c1-4fc2-9527-bfd2273efbbb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wksbn\" (UID: \"3baed13c-c4c1-4fc2-9527-bfd2273efbbb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.617154 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/31b24f51-5194-4af5-a171-bd55caaf8ded-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cszm6\" (UID: \"31b24f51-5194-4af5-a171-bd55caaf8ded\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.618088 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/21386249-439b-4454-828b-f9da9ecce958-etcd-service-ca\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.618149 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-serving-cert\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.618178 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4lb2\" (UniqueName: \"kubernetes.io/projected/71ed135e-3db4-4f03-a89e-f82bc3cf0b34-kube-api-access-h4lb2\") pod \"multus-admission-controller-857f4d67dd-vp24d\" (UID: \"71ed135e-3db4-4f03-a89e-f82bc3cf0b34\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.618202 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5dd2\" (UniqueName: \"kubernetes.io/projected/631645f5-2f1a-41e7-ba2a-a665c827acb5-kube-api-access-t5dd2\") pod \"migrator-59844c95c7-qsg78\" (UID: 
\"631645f5-2f1a-41e7-ba2a-a665c827acb5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.618224 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/929728d6-959b-4532-a9de-298aed7edb3f-signing-cabundle\") pod \"service-ca-9c57cc56f-zgzvb\" (UID: \"929728d6-959b-4532-a9de-298aed7edb3f\") " pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.618251 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8313e458-290f-42ba-8656-dc9dcf0e0b98-trusted-ca\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.618273 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f9c2f7c-9058-4ad2-84a2-037d212792ad-metrics-tls\") pod \"dns-default-zwfns\" (UID: \"5f9c2f7c-9058-4ad2-84a2-037d212792ad\") " pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.618297 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d88c0d3c-4e7a-4dd8-a99d-6118b840c031-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hz5vf\" (UID: \"d88c0d3c-4e7a-4dd8-a99d-6118b840c031\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.618335 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4cfd91e9-ce88-4004-b936-551d50d26a7d-srv-cert\") pod 
\"olm-operator-6b444d44fb-q2wgf\" (UID: \"4cfd91e9-ce88-4004-b936-551d50d26a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.618354 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-config\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.620546 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-config\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.621156 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-oauth-serving-cert\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.621461 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3baed13c-c4c1-4fc2-9527-bfd2273efbbb-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wksbn\" (UID: \"3baed13c-c4c1-4fc2-9527-bfd2273efbbb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.621544 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-socket-dir\") pod 
\"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.622628 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4cfd91e9-ce88-4004-b936-551d50d26a7d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q2wgf\" (UID: \"4cfd91e9-ce88-4004-b936-551d50d26a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.622818 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21386249-439b-4454-828b-f9da9ecce958-config\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.623195 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-csi-data-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.623412 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55-srv-cert\") pod \"catalog-operator-68c6474976-mcrq2\" (UID: \"c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.623436 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f612fb7-c001-4a97-b17c-008bcf100be1-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.623940 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-mountpoint-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.624850 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d88c0d3c-4e7a-4dd8-a99d-6118b840c031-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hz5vf\" (UID: \"d88c0d3c-4e7a-4dd8-a99d-6118b840c031\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.624901 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2f612fb7-c001-4a97-b17c-008bcf100be1-images\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.625351 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21386249-439b-4454-828b-f9da9ecce958-etcd-client\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.625435 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/4e8cc2ad-07fc-4d24-956e-94599d58be06-metrics-tls\") pod \"dns-operator-744455d44c-jtczv\" (UID: \"4e8cc2ad-07fc-4d24-956e-94599d58be06\") " pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" Mar 13 20:31:14 crc kubenswrapper[4790]: E0313 20:31:14.625660 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.125645479 +0000 UTC m=+206.146761370 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.626847 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-tmpfs\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.627553 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/21386249-439b-4454-828b-f9da9ecce958-etcd-service-ca\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.629048 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-trusted-ca-bundle\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.629182 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bc71397-bb77-45b3-92c4-77710458d4fe-config\") pod \"kube-apiserver-operator-766d6c64bb-rwgfw\" (UID: \"0bc71397-bb77-45b3-92c4-77710458d4fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.630449 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.630867 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-secret-volume\") pod \"collect-profiles-29557230-rjmvn\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.631247 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-oauth-config\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.633029 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55-profile-collector-cert\") pod \"catalog-operator-68c6474976-mcrq2\" (UID: \"c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.635018 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f612fb7-c001-4a97-b17c-008bcf100be1-proxy-tls\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.635130 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bc71397-bb77-45b3-92c4-77710458d4fe-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-rwgfw\" (UID: \"0bc71397-bb77-45b3-92c4-77710458d4fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.635842 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3baed13c-c4c1-4fc2-9527-bfd2273efbbb-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wksbn\" (UID: \"3baed13c-c4c1-4fc2-9527-bfd2273efbbb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.636143 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21386249-439b-4454-828b-f9da9ecce958-serving-cert\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.637543 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-serving-cert\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.654803 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.674048 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.676169 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.694535 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.710174 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.712302 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32d4b8de-5800-44a1-b2d9-338e4d267866-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v7kxq\" (UID: \"32d4b8de-5800-44a1-b2d9-338e4d267866\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.718861 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:14 crc kubenswrapper[4790]: E0313 20:31:14.719518 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.219503894 +0000 UTC m=+206.240619785 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.726766 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g"] Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.732773 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.750721 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.771624 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.772822 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.773294 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32d4b8de-5800-44a1-b2d9-338e4d267866-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v7kxq\" (UID: \"32d4b8de-5800-44a1-b2d9-338e4d267866\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.789817 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.812319 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.821098 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: E0313 20:31:14.821945 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.32192467 +0000 UTC m=+206.343040631 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.832798 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.835222 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/31b24f51-5194-4af5-a171-bd55caaf8ded-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cszm6\" (UID: \"31b24f51-5194-4af5-a171-bd55caaf8ded\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.854983 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.868353 4790 request.go:700] Waited for 1.008055644s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/secrets?fieldSelector=metadata.name%3Drouter-certs-default&limit=500&resourceVersion=0 Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.873823 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.892155 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 
20:31:14.895238 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-default-certificate\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.900728 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-stats-auth\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.914615 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.925136 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:14 crc kubenswrapper[4790]: E0313 20:31:14.925475 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.425446646 +0000 UTC m=+206.446562537 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.925964 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: E0313 20:31:14.926297 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.426284638 +0000 UTC m=+206.447400529 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.930020 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-metrics-certs\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.931446 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.950191 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.955416 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-service-ca-bundle\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.967394 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv"] Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.969010 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm"] 
Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.970243 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.975117 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/71ed135e-3db4-4f03-a89e-f82bc3cf0b34-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vp24d\" (UID: \"71ed135e-3db4-4f03-a89e-f82bc3cf0b34\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.984609 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l"] Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.991044 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.011369 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.023878 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d88c0d3c-4e7a-4dd8-a99d-6118b840c031-proxy-tls\") pod \"machine-config-controller-84d6567774-hz5vf\" (UID: \"d88c0d3c-4e7a-4dd8-a99d-6118b840c031\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.026822 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 
20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.027734 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.527713768 +0000 UTC m=+206.548829659 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.031894 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-x7zgr"] Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.033617 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.037355 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g"] Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.045028 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v"] Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.049592 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.061605 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/4cfd91e9-ce88-4004-b936-551d50d26a7d-srv-cert\") pod \"olm-operator-6b444d44fb-q2wgf\" (UID: \"4cfd91e9-ce88-4004-b936-551d50d26a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.069170 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.089829 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.096665 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4af8dabc-a918-4188-8257-112b5f8d71d0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mg7x\" (UID: \"4af8dabc-a918-4188-8257-112b5f8d71d0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.110364 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.129570 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.130205 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.630180775 +0000 UTC m=+206.651296746 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.130731 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.139276 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4af8dabc-a918-4188-8257-112b5f8d71d0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mg7x\" (UID: \"4af8dabc-a918-4188-8257-112b5f8d71d0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.152076 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.170099 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.181718 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zsqd7"] Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.183264 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/8313e458-290f-42ba-8656-dc9dcf0e0b98-metrics-tls\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.184665 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-szftl"] Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.184715 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r"] Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.189978 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: W0313 20:31:15.191913 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b88ca59_d36e_4682_99e1_10ef4fa85e10.slice/crio-8216dd3e1085fcfbe8a126b55ed0b79dea9bde42cdb5342bbb76fb27ef744609 WatchSource:0}: Error finding container 8216dd3e1085fcfbe8a126b55ed0b79dea9bde42cdb5342bbb76fb27ef744609: Status 404 returned error can't find the container with id 8216dd3e1085fcfbe8a126b55ed0b79dea9bde42cdb5342bbb76fb27ef744609 Mar 13 20:31:15 crc kubenswrapper[4790]: W0313 20:31:15.192956 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9680aeb7_b61a_46a8_baf5_44715261e4a5.slice/crio-7e7e7141df31dfc4ded27d369062544f96ae747ef387acfa5853705562325a54 WatchSource:0}: Error finding container 7e7e7141df31dfc4ded27d369062544f96ae747ef387acfa5853705562325a54: Status 404 returned error can't find the container with id 7e7e7141df31dfc4ded27d369062544f96ae747ef387acfa5853705562325a54 Mar 13 20:31:15 crc kubenswrapper[4790]: W0313 20:31:15.198400 4790 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod869d7601_27fe_4a6a_840b_a9811c4d1e06.slice/crio-261ca998108ed493dc900955a8fd9a4c77b099c17c3446f5d7d42417ca41db4e WatchSource:0}: Error finding container 261ca998108ed493dc900955a8fd9a4c77b099c17c3446f5d7d42417ca41db4e: Status 404 returned error can't find the container with id 261ca998108ed493dc900955a8fd9a4c77b099c17c3446f5d7d42417ca41db4e Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.210699 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.229936 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.231474 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.231639 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.731619904 +0000 UTC m=+206.752735795 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.231910 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.232260 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.732248492 +0000 UTC m=+206.753364383 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.257545 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.259443 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8313e458-290f-42ba-8656-dc9dcf0e0b98-trusted-ca\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.271802 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.293849 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.310648 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.321500 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jnbzb\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.333312 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.333515 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.833487796 +0000 UTC m=+206.854603697 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.334294 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.334911 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2026-03-13 20:31:15.834894084 +0000 UTC m=+206.856009965 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.340021 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.349559 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jnbzb\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.350254 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.370808 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.380445 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-apiservice-cert\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:15 crc 
kubenswrapper[4790]: I0313 20:31:15.381203 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-webhook-cert\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.390366 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.410600 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.432913 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.435621 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.436283 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.936270032 +0000 UTC m=+206.957385923 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.439437 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/929728d6-959b-4532-a9de-298aed7edb3f-signing-key\") pod \"service-ca-9c57cc56f-zgzvb\" (UID: \"929728d6-959b-4532-a9de-298aed7edb3f\") " pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.450245 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.456291 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/929728d6-959b-4532-a9de-298aed7edb3f-signing-cabundle\") pod \"service-ca-9c57cc56f-zgzvb\" (UID: \"929728d6-959b-4532-a9de-298aed7edb3f\") " pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.470292 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.489789 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.495563 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-config-volume\") pod \"collect-profiles-29557230-rjmvn\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.510927 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.530795 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.536955 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.537649 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.037622989 +0000 UTC m=+207.058738880 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.540537 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa273b20-a91d-43ea-a18d-784ad7cdc7a7-serving-cert\") pod \"service-ca-operator-777779d784-vs2vp\" (UID: \"aa273b20-a91d-43ea-a18d-784ad7cdc7a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.550835 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.555623 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa273b20-a91d-43ea-a18d-784ad7cdc7a7-config\") pod \"service-ca-operator-777779d784-vs2vp\" (UID: \"aa273b20-a91d-43ea-a18d-784ad7cdc7a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.570034 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.590515 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.609917 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.620393 4790 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.620495 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-certs podName:979fe4d1-6e0f-4b07-b994-c183a200a1cc nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.120471945 +0000 UTC m=+207.141587846 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-certs") pod "machine-config-server-vggp9" (UID: "979fe4d1-6e0f-4b07-b994-c183a200a1cc") : failed to sync secret cache: timed out waiting for the condition Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.624866 4790 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.624890 4790 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.624933 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f9c2f7c-9058-4ad2-84a2-037d212792ad-config-volume podName:5f9c2f7c-9058-4ad2-84a2-037d212792ad nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.124914825 +0000 UTC m=+207.146030716 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/5f9c2f7c-9058-4ad2-84a2-037d212792ad-config-volume") pod "dns-default-zwfns" (UID: "5f9c2f7c-9058-4ad2-84a2-037d212792ad") : failed to sync configmap cache: timed out waiting for the condition Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.624972 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-node-bootstrap-token podName:979fe4d1-6e0f-4b07-b994-c183a200a1cc nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.124952736 +0000 UTC m=+207.146068717 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-node-bootstrap-token") pod "machine-config-server-vggp9" (UID: "979fe4d1-6e0f-4b07-b994-c183a200a1cc") : failed to sync secret cache: timed out waiting for the condition Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.626095 4790 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.626139 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcf10b74-f8ce-4748-a813-5aefe86f13f7-cert podName:bcf10b74-f8ce-4748-a813-5aefe86f13f7 nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.126128728 +0000 UTC m=+207.147244679 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bcf10b74-f8ce-4748-a813-5aefe86f13f7-cert") pod "ingress-canary-cxj7h" (UID: "bcf10b74-f8ce-4748-a813-5aefe86f13f7") : failed to sync secret cache: timed out waiting for the condition Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.626159 4790 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.626191 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f9c2f7c-9058-4ad2-84a2-037d212792ad-metrics-tls podName:5f9c2f7c-9058-4ad2-84a2-037d212792ad nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.126185239 +0000 UTC m=+207.147301130 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f9c2f7c-9058-4ad2-84a2-037d212792ad-metrics-tls") pod "dns-default-zwfns" (UID: "5f9c2f7c-9058-4ad2-84a2-037d212792ad") : failed to sync secret cache: timed out waiting for the condition Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.631856 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.634307 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" event={"ID":"99d847db-0b8e-4128-af43-a17fe76b77d9","Type":"ContainerStarted","Data":"ebfe703c3346c51c55d0091fdef8277d072070f295dcf6c21a0d3512628de2cc"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.634357 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" 
event={"ID":"99d847db-0b8e-4128-af43-a17fe76b77d9","Type":"ContainerStarted","Data":"b816014f13cfdff9d0d158091e5577b68d530c30531691cf5dd060532ad4ad8b"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.636467 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g" event={"ID":"b8f95d7e-96c6-475c-8bef-d72937cc36b4","Type":"ContainerStarted","Data":"de9a5a029572d8130097c923ea75100942a444a6c4280d1bccce64d2d69cba59"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.636496 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g" event={"ID":"b8f95d7e-96c6-475c-8bef-d72937cc36b4","Type":"ContainerStarted","Data":"b62648494389c80e3eaa2b1b2b854ec9a63c118140f231636d4806a9711e69c9"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.637900 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.638061 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.138031931 +0000 UTC m=+207.159147842 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.638523 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.638618 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" event={"ID":"9680aeb7-b61a-46a8-baf5-44715261e4a5","Type":"ContainerStarted","Data":"4dce60806026c2e057eacfafdb9eb0bcee1204f32aecb7bffa715ddddc59e383"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.638653 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" event={"ID":"9680aeb7-b61a-46a8-baf5-44715261e4a5","Type":"ContainerStarted","Data":"7e7e7141df31dfc4ded27d369062544f96ae747ef387acfa5853705562325a54"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.639514 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.639845 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.139831099 +0000 UTC m=+207.160947090 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.640878 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" event={"ID":"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9","Type":"ContainerStarted","Data":"b0fb5457a9676ea9d3a55511a014a0d139b4e8575ca4d1d1a0534aae99f0076d"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.640928 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" event={"ID":"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9","Type":"ContainerStarted","Data":"99cf4ef26fb9eb5a3a40ad496b60c26b191859906bd206806ca175b1e727b6b2"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.641734 4790 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-szftl container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" start-of-body= Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.641789 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" podUID="9680aeb7-b61a-46a8-baf5-44715261e4a5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 
10.217.0.8:6443: connect: connection refused" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.641838 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.645948 4790 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-ftx7g container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.646022 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" podUID="1f45edb0-2914-47c2-82f3-a0f5a99fe9e9" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.651038 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.654075 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" event={"ID":"44748a56-ff71-45b3-a67a-34d5bf7ae56b","Type":"ContainerStarted","Data":"e97e6497c51c2fe5530b50c36f5849b6d8cd976e0fc685660defa3b9e67a0c15"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.654137 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" event={"ID":"44748a56-ff71-45b3-a67a-34d5bf7ae56b","Type":"ContainerStarted","Data":"48d283e7a36b98ce9ace9c712d6b36f26cae6a6fb99bf24b120f5b593ad2f89c"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.654154 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" event={"ID":"44748a56-ff71-45b3-a67a-34d5bf7ae56b","Type":"ContainerStarted","Data":"53513cfc7dd35443cac97edfbe3f8b6ea8c9c0ab7472ef6bc6ae0515a7351549"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.655686 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" event={"ID":"c5db072c-5e1d-4149-99c8-aee1209189ba","Type":"ContainerStarted","Data":"32826041d22fbf81f5b23358d558441cd59a28cb615a950d1fb409e66cbb34ab"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.655742 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" event={"ID":"c5db072c-5e1d-4149-99c8-aee1209189ba","Type":"ContainerStarted","Data":"cb8e272961bb2b28d937933c7b4e6b41a719555d43e36b797bc307b7e9163e90"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.657095 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" event={"ID":"ad15b399-2051-480d-8389-f58f94c10d81","Type":"ContainerStarted","Data":"da9b8ff2acb058a5d935a292de2d3d9c5023c6f57e9d87d6a5c17d9accf74e90"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.657142 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" event={"ID":"ad15b399-2051-480d-8389-f58f94c10d81","Type":"ContainerStarted","Data":"874c15ececd0b63d991fbafbb9359795460fd7f07325a5f3001aadfdfe1c3ef3"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.660736 4790 generic.go:334] "Generic (PLEG): container finished" podID="4b88ca59-d36e-4682-99e1-10ef4fa85e10" containerID="781d003f62040f0fc1eb1bf495b03e746bb6b54b3e07185c148fce4a51a3a49d" exitCode=0 Mar 13 20:31:15 crc 
kubenswrapper[4790]: I0313 20:31:15.662580 4790 generic.go:334] "Generic (PLEG): container finished" podID="1db4655f-49dd-48c8-a290-c3c4f2fb74ba" containerID="5aca94d81c2dfec69adb29425b5bbddde9204e3417b4b3e8b6253c05a7384489" exitCode=0 Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.666756 4790 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-zsqd7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.666820 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" podUID="869d7601-27fe-4a6a-840b-a9811c4d1e06" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.672899 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.675978 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.676015 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" event={"ID":"4b88ca59-d36e-4682-99e1-10ef4fa85e10","Type":"ContainerDied","Data":"781d003f62040f0fc1eb1bf495b03e746bb6b54b3e07185c148fce4a51a3a49d"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.676066 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" 
event={"ID":"4b88ca59-d36e-4682-99e1-10ef4fa85e10","Type":"ContainerStarted","Data":"8216dd3e1085fcfbe8a126b55ed0b79dea9bde42cdb5342bbb76fb27ef744609"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.676078 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" event={"ID":"1db4655f-49dd-48c8-a290-c3c4f2fb74ba","Type":"ContainerDied","Data":"5aca94d81c2dfec69adb29425b5bbddde9204e3417b4b3e8b6253c05a7384489"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.676089 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" event={"ID":"1db4655f-49dd-48c8-a290-c3c4f2fb74ba","Type":"ContainerStarted","Data":"bafc42cbeb4a06000c3f2adeb55459b9581ddf69b803cb49804f88aa0878c560"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.676097 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" event={"ID":"869d7601-27fe-4a6a-840b-a9811c4d1e06","Type":"ContainerStarted","Data":"b3f64a80f53b3463abb2e75cb2ad8094df85b77279ffcd7d0508ada4f6f68f83"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.676105 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" event={"ID":"869d7601-27fe-4a6a-840b-a9811c4d1e06","Type":"ContainerStarted","Data":"261ca998108ed493dc900955a8fd9a4c77b099c17c3446f5d7d42417ca41db4e"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.690015 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.710591 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.730033 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.739745 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.739831 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.23981528 +0000 UTC m=+207.260931171 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.741174 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.742261 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-13 20:31:16.242247896 +0000 UTC m=+207.263363787 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.749968 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.770850 4790 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.790731 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.811089 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.831129 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.843402 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.843692 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.343663615 +0000 UTC m=+207.364779506 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.843938 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.844281 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.344272441 +0000 UTC m=+207.365388332 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.849314 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.870770 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.888718 4790 request.go:700] Waited for 1.955520875s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.890563 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.909615 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.944929 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.945669 4790 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.445633448 +0000 UTC m=+207.466749349 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.976287 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzpr9\" (UniqueName: \"kubernetes.io/projected/6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150-kube-api-access-hzpr9\") pod \"downloads-7954f5f757-zfhhl\" (UID: \"6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150\") " pod="openshift-console/downloads-7954f5f757-zfhhl" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.989329 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw6b8\" (UniqueName: \"kubernetes.io/projected/a626166a-5d74-4dd9-b838-746731bfedef-kube-api-access-vw6b8\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.015313 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr5wq\" (UniqueName: \"kubernetes.io/projected/94386d3d-038a-4e4d-9e97-fd04336847a0-kube-api-access-dr5wq\") pod \"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " 
pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.029411 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf4v8\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-kube-api-access-zf4v8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.046480 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:16 crc kubenswrapper[4790]: E0313 20:31:16.046913 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.546895653 +0000 UTC m=+207.568011544 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.051489 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xc7h\" (UniqueName: \"kubernetes.io/projected/071ab142-7ad6-43bc-aa6a-e6761ea33b15-kube-api-access-6xc7h\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.072836 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-bound-sa-token\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.087533 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv9j5\" (UniqueName: \"kubernetes.io/projected/4fa77308-6519-4481-b87b-4a1b066bada3-kube-api-access-rv9j5\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.092932 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.109835 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n25p\" (UniqueName: \"kubernetes.io/projected/3635b091-f7bf-4c6d-bb7a-5723b36f990f-kube-api-access-5n25p\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.136503 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb9zk\" (UniqueName: \"kubernetes.io/projected/d598b7c0-7c77-4903-9138-d8a3d01f9efe-kube-api-access-hb9zk\") pod \"auto-csr-approver-29557230-8pqh8\" (UID: \"d598b7c0-7c77-4903-9138-d8a3d01f9efe\") " pod="openshift-infra/auto-csr-approver-29557230-8pqh8" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.145274 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t4tw\" (UniqueName: \"kubernetes.io/projected/21386249-439b-4454-828b-f9da9ecce958-kube-api-access-8t4tw\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.147566 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.147730 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcf10b74-f8ce-4748-a813-5aefe86f13f7-cert\") pod \"ingress-canary-cxj7h\" (UID: 
\"bcf10b74-f8ce-4748-a813-5aefe86f13f7\") " pod="openshift-ingress-canary/ingress-canary-cxj7h" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.147789 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-certs\") pod \"machine-config-server-vggp9\" (UID: \"979fe4d1-6e0f-4b07-b994-c183a200a1cc\") " pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:16 crc kubenswrapper[4790]: E0313 20:31:16.147802 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.647784257 +0000 UTC m=+207.668900148 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.147865 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f9c2f7c-9058-4ad2-84a2-037d212792ad-metrics-tls\") pod \"dns-default-zwfns\" (UID: \"5f9c2f7c-9058-4ad2-84a2-037d212792ad\") " pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.147936 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f9c2f7c-9058-4ad2-84a2-037d212792ad-config-volume\") pod \"dns-default-zwfns\" (UID: 
\"5f9c2f7c-9058-4ad2-84a2-037d212792ad\") " pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.147969 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-node-bootstrap-token\") pod \"machine-config-server-vggp9\" (UID: \"979fe4d1-6e0f-4b07-b994-c183a200a1cc\") " pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.148107 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.149533 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f9c2f7c-9058-4ad2-84a2-037d212792ad-config-volume\") pod \"dns-default-zwfns\" (UID: \"5f9c2f7c-9058-4ad2-84a2-037d212792ad\") " pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:16 crc kubenswrapper[4790]: E0313 20:31:16.150698 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.650681967 +0000 UTC m=+207.671798058 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.153287 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-certs\") pod \"machine-config-server-vggp9\" (UID: \"979fe4d1-6e0f-4b07-b994-c183a200a1cc\") " pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.153627 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-node-bootstrap-token\") pod \"machine-config-server-vggp9\" (UID: \"979fe4d1-6e0f-4b07-b994-c183a200a1cc\") " pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.153786 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f9c2f7c-9058-4ad2-84a2-037d212792ad-metrics-tls\") pod \"dns-default-zwfns\" (UID: \"5f9c2f7c-9058-4ad2-84a2-037d212792ad\") " pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.153967 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcf10b74-f8ce-4748-a813-5aefe86f13f7-cert\") pod \"ingress-canary-cxj7h\" (UID: \"bcf10b74-f8ce-4748-a813-5aefe86f13f7\") " pod="openshift-ingress-canary/ingress-canary-cxj7h" Mar 13 20:31:16 crc 
kubenswrapper[4790]: I0313 20:31:16.154076 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.159527 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.171794 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chx4v\" (UniqueName: \"kubernetes.io/projected/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-kube-api-access-chx4v\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.188036 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75sc9\" (UniqueName: \"kubernetes.io/projected/bcf10b74-f8ce-4748-a813-5aefe86f13f7-kube-api-access-75sc9\") pod \"ingress-canary-cxj7h\" (UID: \"bcf10b74-f8ce-4748-a813-5aefe86f13f7\") " pod="openshift-ingress-canary/ingress-canary-cxj7h" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.189678 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-zfhhl" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.212436 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42ljl\" (UniqueName: \"kubernetes.io/projected/2f612fb7-c001-4a97-b17c-008bcf100be1-kube-api-access-42ljl\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.214301 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557230-8pqh8" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.234282 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfldl\" (UniqueName: \"kubernetes.io/projected/53c38463-b7c5-42c8-a447-7d0e7f190aa9-kube-api-access-zfldl\") pod \"marketplace-operator-79b997595-jnbzb\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.251885 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:16 crc kubenswrapper[4790]: E0313 20:31:16.252545 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.752525857 +0000 UTC m=+207.773641758 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.262565 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmp7h\" (UniqueName: \"kubernetes.io/projected/31b24f51-5194-4af5-a171-bd55caaf8ded-kube-api-access-zmp7h\") pod \"package-server-manager-789f6589d5-cszm6\" (UID: \"31b24f51-5194-4af5-a171-bd55caaf8ded\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.266568 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwxj6\" (UniqueName: \"kubernetes.io/projected/8313e458-290f-42ba-8656-dc9dcf0e0b98-kube-api-access-kwxj6\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.274641 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cxj7h" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.302037 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpwnr\" (UniqueName: \"kubernetes.io/projected/9e6c6344-8059-43d7-97be-273d115b8471-kube-api-access-gpwnr\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.324749 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.327824 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jvkr\" (UniqueName: \"kubernetes.io/projected/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-kube-api-access-5jvkr\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.335168 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm22j\" (UniqueName: \"kubernetes.io/projected/5f9c2f7c-9058-4ad2-84a2-037d212792ad-kube-api-access-bm22j\") pod \"dns-default-zwfns\" (UID: \"5f9c2f7c-9058-4ad2-84a2-037d212792ad\") " pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.343130 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rmlmp"] Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.343526 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.346473 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.351523 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.354126 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:16 crc kubenswrapper[4790]: E0313 20:31:16.354766 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.854750487 +0000 UTC m=+207.875866378 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.360559 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8rjf\" (UniqueName: \"kubernetes.io/projected/aa273b20-a91d-43ea-a18d-784ad7cdc7a7-kube-api-access-b8rjf\") pod \"service-ca-operator-777779d784-vs2vp\" (UID: \"aa273b20-a91d-43ea-a18d-784ad7cdc7a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.374410 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.380968 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gbdr\" (UniqueName: \"kubernetes.io/projected/c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55-kube-api-access-9gbdr\") pod \"catalog-operator-68c6474976-mcrq2\" (UID: \"c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.385431 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.398826 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2b96\" (UniqueName: \"kubernetes.io/projected/979fe4d1-6e0f-4b07-b994-c183a200a1cc-kube-api-access-l2b96\") pod \"machine-config-server-vggp9\" (UID: \"979fe4d1-6e0f-4b07-b994-c183a200a1cc\") " pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.406608 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8313e458-290f-42ba-8656-dc9dcf0e0b98-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:16 crc kubenswrapper[4790]: W0313 20:31:16.421289 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94386d3d_038a_4e4d_9e97_fd04336847a0.slice/crio-b1342d6476c269b9d560d9693777a45967c9d64c93334f67af06d9dc7985ed65 WatchSource:0}: Error finding container b1342d6476c269b9d560d9693777a45967c9d64c93334f67af06d9dc7985ed65: Status 404 returned error can't find the container with id b1342d6476c269b9d560d9693777a45967c9d64c93334f67af06d9dc7985ed65 Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.441070 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.443198 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bc71397-bb77-45b3-92c4-77710458d4fe-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-rwgfw\" (UID: \"0bc71397-bb77-45b3-92c4-77710458d4fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.447725 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zws8z"] Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.450041 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.455031 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:16 crc kubenswrapper[4790]: E0313 20:31:16.455692 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.955672853 +0000 UTC m=+207.976788744 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.457703 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chlbb\" (UniqueName: \"kubernetes.io/projected/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-kube-api-access-chlbb\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.462162 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.492914 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.495437 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq5sl\" (UniqueName: \"kubernetes.io/projected/d88c0d3c-4e7a-4dd8-a99d-6118b840c031-kube-api-access-hq5sl\") pod \"machine-config-controller-84d6567774-hz5vf\" (UID: \"d88c0d3c-4e7a-4dd8-a99d-6118b840c031\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.524249 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.524437 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbmtd\" (UniqueName: \"kubernetes.io/projected/4e8cc2ad-07fc-4d24-956e-94599d58be06-kube-api-access-qbmtd\") pod \"dns-operator-744455d44c-jtczv\" (UID: \"4e8cc2ad-07fc-4d24-956e-94599d58be06\") " pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.525312 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sl24\" (UniqueName: \"kubernetes.io/projected/32d4b8de-5800-44a1-b2d9-338e4d267866-kube-api-access-8sl24\") pod \"kube-storage-version-migrator-operator-b67b599dd-v7kxq\" (UID: \"32d4b8de-5800-44a1-b2d9-338e4d267866\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.526971 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8ttg\" (UniqueName: \"kubernetes.io/projected/4cfd91e9-ce88-4004-b936-551d50d26a7d-kube-api-access-p8ttg\") pod \"olm-operator-6b444d44fb-q2wgf\" (UID: \"4cfd91e9-ce88-4004-b936-551d50d26a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.537450 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.553676 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5dd2\" (UniqueName: \"kubernetes.io/projected/631645f5-2f1a-41e7-ba2a-a665c827acb5-kube-api-access-t5dd2\") pod \"migrator-59844c95c7-qsg78\" (UID: \"631645f5-2f1a-41e7-ba2a-a665c827acb5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.557140 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:16 crc kubenswrapper[4790]: E0313 20:31:16.557551 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:17.057534565 +0000 UTC m=+208.078650456 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.565056 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.577493 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9vps\" (UniqueName: \"kubernetes.io/projected/4af8dabc-a918-4188-8257-112b5f8d71d0-kube-api-access-x9vps\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mg7x\" (UID: \"4af8dabc-a918-4188-8257-112b5f8d71d0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.598146 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4lb2\" (UniqueName: \"kubernetes.io/projected/71ed135e-3db4-4f03-a89e-f82bc3cf0b34-kube-api-access-h4lb2\") pod \"multus-admission-controller-857f4d67dd-vp24d\" (UID: \"71ed135e-3db4-4f03-a89e-f82bc3cf0b34\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.604024 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.627521 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx89m\" (UniqueName: \"kubernetes.io/projected/929728d6-959b-4532-a9de-298aed7edb3f-kube-api-access-vx89m\") pod \"service-ca-9c57cc56f-zgzvb\" (UID: \"929728d6-959b-4532-a9de-298aed7edb3f\") " pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.636001 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.639802 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d66s7\" (UniqueName: \"kubernetes.io/projected/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-kube-api-access-d66s7\") pod \"collect-profiles-29557230-rjmvn\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.653965 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3baed13c-c4c1-4fc2-9527-bfd2273efbbb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wksbn\" (UID: \"3baed13c-c4c1-4fc2-9527-bfd2273efbbb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.699915 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.700012 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:16 crc kubenswrapper[4790]: E0313 20:31:16.700487 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:17.200471739 +0000 UTC m=+208.221587630 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.700584 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.700862 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.701328 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.703102 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.711181 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.722733 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.733716 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.770780 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.790119 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.792534 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" event={"ID":"1db4655f-49dd-48c8-a290-c3c4f2fb74ba","Type":"ContainerStarted","Data":"a87ec9f1894e7d9b7b1786a0ef0e7eb51b5127f6208e76aa1e81c2bea75c5403"} Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.792594 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" event={"ID":"1db4655f-49dd-48c8-a290-c3c4f2fb74ba","Type":"ContainerStarted","Data":"9cbf9d380232bd1754c195399e9a361b22282e4e22905893b49fa8e90e03d8b6"} Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.801265 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:16 crc kubenswrapper[4790]: E0313 20:31:16.801962 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:17.301938369 +0000 UTC m=+208.323054260 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.877560 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" event={"ID":"071ab142-7ad6-43bc-aa6a-e6761ea33b15","Type":"ContainerStarted","Data":"2bfb08f96742cc86b7a606328d661c5f984e9f7428b3ad5d63c6930aa0e96574"} Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.975483 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:16 crc kubenswrapper[4790]: E0313 20:31:16.982694 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:17.482662968 +0000 UTC m=+208.503778859 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.982895 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.076982 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:17 crc kubenswrapper[4790]: E0313 20:31:17.078041 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:17.578023243 +0000 UTC m=+208.599139224 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.178748 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:17 crc kubenswrapper[4790]: E0313 20:31:17.178877 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:17.678854726 +0000 UTC m=+208.699970617 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.179271 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:17 crc kubenswrapper[4790]: E0313 20:31:17.179701 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:17.679689388 +0000 UTC m=+208.700805279 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.211290 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jfdgz"] Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.216066 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" event={"ID":"4fa77308-6519-4481-b87b-4a1b066bada3","Type":"ContainerStarted","Data":"227100ec933a6af452ff20a1fdd2ae7a6b5c83e731d26ed986ecd24ac0bd2cd4"} Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.235444 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" podStartSLOduration=163.235426309 podStartE2EDuration="2m43.235426309s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:17.234137604 +0000 UTC m=+208.255253515" watchObservedRunningTime="2026-03-13 20:31:17.235426309 +0000 UTC m=+208.256542200" Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.237209 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cxj7h"] Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.243165 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557230-8pqh8"] Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 
20:31:17.257782 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-rmlmp" event={"ID":"94386d3d-038a-4e4d-9e97-fd04336847a0","Type":"ContainerStarted","Data":"b1342d6476c269b9d560d9693777a45967c9d64c93334f67af06d9dc7985ed65"} Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.272971 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" event={"ID":"4b88ca59-d36e-4682-99e1-10ef4fa85e10","Type":"ContainerStarted","Data":"efc8fb8838f66d4475c37013c5a7c283b65e44000d6f8ff67542d50e99d54e3a"} Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.273062 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zfhhl"] Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.273090 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.280463 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.288431 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.288719 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:17 crc kubenswrapper[4790]: E0313 20:31:17.290260 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:17.790240034 +0000 UTC m=+208.811355925 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.391823 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:17 crc kubenswrapper[4790]: E0313 20:31:17.397322 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:17.897301967 +0000 UTC m=+208.918417858 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.500866 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:17 crc kubenswrapper[4790]: E0313 20:31:17.501624 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:18.001606694 +0000 UTC m=+209.022722585 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.513720 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" podStartSLOduration=162.513702352 podStartE2EDuration="2m42.513702352s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:17.512784987 +0000 UTC m=+208.533900888" watchObservedRunningTime="2026-03-13 20:31:17.513702352 +0000 UTC m=+208.534818243" Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.633994 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:17 crc kubenswrapper[4790]: E0313 20:31:17.634457 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:18.134445714 +0000 UTC m=+209.155561605 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.643761 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" podStartSLOduration=163.643741777 podStartE2EDuration="2m43.643741777s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:17.643061508 +0000 UTC m=+208.664177399" watchObservedRunningTime="2026-03-13 20:31:17.643741777 +0000 UTC m=+208.664857658" Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.672559 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.735154 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:17 crc kubenswrapper[4790]: E0313 20:31:17.735513 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 20:31:18.235487233 +0000 UTC m=+209.256603144 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.771999 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" podStartSLOduration=162.771981112 podStartE2EDuration="2m42.771981112s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:17.766231366 +0000 UTC m=+208.787347267" watchObservedRunningTime="2026-03-13 20:31:17.771981112 +0000 UTC m=+208.793097003" Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.818572 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.836547 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:17 crc kubenswrapper[4790]: E0313 20:31:17.836841 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:18.33682974 +0000 UTC m=+209.357945631 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.937113 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:17 crc kubenswrapper[4790]: E0313 20:31:17.937528 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:18.437509359 +0000 UTC m=+209.458625250 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.040225 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:18 crc kubenswrapper[4790]: E0313 20:31:18.040772 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:18.540760158 +0000 UTC m=+209.561876049 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.103131 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" podStartSLOduration=163.103116028 podStartE2EDuration="2m43.103116028s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:18.100042504 +0000 UTC m=+209.121158395" watchObservedRunningTime="2026-03-13 20:31:18.103116028 +0000 UTC m=+209.124231919" Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.141585 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:18 crc kubenswrapper[4790]: E0313 20:31:18.142142 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:18.64193715 +0000 UTC m=+209.663053041 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.169930 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" podStartSLOduration=163.169900128 podStartE2EDuration="2m43.169900128s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:18.156707321 +0000 UTC m=+209.177823212" watchObservedRunningTime="2026-03-13 20:31:18.169900128 +0000 UTC m=+209.191016049" Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.242582 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc"] Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.248317 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:18 crc kubenswrapper[4790]: E0313 20:31:18.248753 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-13 20:31:18.748736975 +0000 UTC m=+209.769852916 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:18 crc kubenswrapper[4790]: W0313 20:31:18.314185 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f612fb7_c001_4a97_b17c_008bcf100be1.slice/crio-7a46ca8e0bdbb4d72f3d47182d95660d4ed68daa8376f6819cafacb28a6f0ed6 WatchSource:0}: Error finding container 7a46ca8e0bdbb4d72f3d47182d95660d4ed68daa8376f6819cafacb28a6f0ed6: Status 404 returned error can't find the container with id 7a46ca8e0bdbb4d72f3d47182d95660d4ed68daa8376f6819cafacb28a6f0ed6 Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.314650 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-pzx4q" event={"ID":"658b4bb6-837c-48ed-b5f3-aa30bd1e9740","Type":"ContainerStarted","Data":"b42249f3cd9250bbc6e5200abf52b5b898153886f7c6af8892dbd5b3671bbbc1"} Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.316747 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" podStartSLOduration=164.316734579 podStartE2EDuration="2m44.316734579s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:18.314531939 +0000 UTC m=+209.335647830" watchObservedRunningTime="2026-03-13 
20:31:18.316734579 +0000 UTC m=+209.337850470" Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.347211 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557230-8pqh8" event={"ID":"d598b7c0-7c77-4903-9138-d8a3d01f9efe","Type":"ContainerStarted","Data":"3851738f410766329c5133a13a2bdd38c600122354cde8d6b4c645c3b69815b7"} Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.349920 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:18 crc kubenswrapper[4790]: E0313 20:31:18.350298 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:18.850282458 +0000 UTC m=+209.871398349 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.371153 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" event={"ID":"a626166a-5d74-4dd9-b838-746731bfedef","Type":"ContainerStarted","Data":"660f5500f3fb94b3dbe414e94cd324b8a637bddb7665324b98b41825393454a0"} Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.396978 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-vggp9" event={"ID":"979fe4d1-6e0f-4b07-b994-c183a200a1cc","Type":"ContainerStarted","Data":"1a1addd1d481e0bd74e584a8d677c2f02520a9773b0da658e9f30ec883c4da25"} Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.436059 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-rmlmp" event={"ID":"94386d3d-038a-4e4d-9e97-fd04336847a0","Type":"ContainerStarted","Data":"a84716ffc7fb5f6120fa4da6d9fe9147bd141b929386b6b944fa920bcd3f7794"} Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.436695 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.441078 4790 patch_prober.go:28] interesting pod/console-operator-58897d9998-rmlmp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" 
start-of-body= Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.441120 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-rmlmp" podUID="94386d3d-038a-4e4d-9e97-fd04336847a0" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.449799 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cxj7h" event={"ID":"bcf10b74-f8ce-4748-a813-5aefe86f13f7","Type":"ContainerStarted","Data":"5a65029229d0cab9fcd1f3d47d5f233fa9fb2bd8556317970877f9af1851b06f"} Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.451011 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:18 crc kubenswrapper[4790]: E0313 20:31:18.451347 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:18.951335017 +0000 UTC m=+209.972450908 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.464474 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zfhhl" event={"ID":"6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150","Type":"ContainerStarted","Data":"b436cb5e3f5d468aba071bbb52490fe41be9f758fe2861c288ae2f9dacadcab0"} Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.512528 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g" podStartSLOduration=163.512513675 podStartE2EDuration="2m43.512513675s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:18.510876541 +0000 UTC m=+209.531992432" watchObservedRunningTime="2026-03-13 20:31:18.512513675 +0000 UTC m=+209.533629566" Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.522047 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p"] Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.529768 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.548188 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zwfns"] Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 
20:31:18.553136 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:18 crc kubenswrapper[4790]: E0313 20:31:18.554345 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:19.054325839 +0000 UTC m=+210.075441730 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.576504 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w"] Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.578510 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-kfl48"] Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.591010 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8"] Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.651586 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-q5j7f"] Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.654934 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:18 crc kubenswrapper[4790]: E0313 20:31:18.656887 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:19.156872918 +0000 UTC m=+210.177988809 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.665874 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6"] Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.689451 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" podStartSLOduration=164.689431971 podStartE2EDuration="2m44.689431971s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:18.666612682 +0000 UTC m=+209.687728573" watchObservedRunningTime="2026-03-13 20:31:18.689431971 +0000 UTC m=+209.710547862" Mar 13 20:31:18 
crc kubenswrapper[4790]: I0313 20:31:18.691088 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-rmlmp" podStartSLOduration=164.691082625 podStartE2EDuration="2m44.691082625s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:18.688128555 +0000 UTC m=+209.709244456" watchObservedRunningTime="2026-03-13 20:31:18.691082625 +0000 UTC m=+209.712198516" Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.753967 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" podStartSLOduration=164.753944829 podStartE2EDuration="2m44.753944829s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:18.753025744 +0000 UTC m=+209.774141655" watchObservedRunningTime="2026-03-13 20:31:18.753944829 +0000 UTC m=+209.775060720" Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.756029 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:18 crc kubenswrapper[4790]: E0313 20:31:18.756646 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:19.256625612 +0000 UTC m=+210.277741503 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.860504 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:18 crc kubenswrapper[4790]: E0313 20:31:18.860991 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:19.36097677 +0000 UTC m=+210.382092661 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.961976 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:18 crc kubenswrapper[4790]: E0313 20:31:18.962250 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:19.462231144 +0000 UTC m=+210.483347045 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.962628 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:18 crc kubenswrapper[4790]: E0313 20:31:18.963004 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:19.462993515 +0000 UTC m=+210.484109396 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.012533 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.064156 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:19 crc kubenswrapper[4790]: E0313 20:31:19.064579 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:19.564564119 +0000 UTC m=+210.585680010 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.167262 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:19 crc kubenswrapper[4790]: E0313 20:31:19.167911 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:19.66789502 +0000 UTC m=+210.689010911 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.269151 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:19 crc kubenswrapper[4790]: E0313 20:31:19.269515 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:19.769497783 +0000 UTC m=+210.790613674 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.376703 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:19 crc kubenswrapper[4790]: E0313 20:31:19.377453 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:19.877430768 +0000 UTC m=+210.898546829 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.451267 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jnbzb"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.462091 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.475311 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.476972 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jtczv"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.477431 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:19 crc kubenswrapper[4790]: E0313 20:31:19.477824 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 20:31:19.977809529 +0000 UTC m=+210.998925430 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.478813 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.482159 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vp24d"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.500192 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zgzvb"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.516962 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zfhhl" event={"ID":"6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150","Type":"ContainerStarted","Data":"e9bbd363611d3d25a3b6940c0c5a363cbf07f241be6299b10534167899b2bdac"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.519868 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-zfhhl" Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.523075 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 
20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.523121 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.535506 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.537323 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.542882 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-zfhhl" podStartSLOduration=165.542862662 podStartE2EDuration="2m45.542862662s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:19.538787932 +0000 UTC m=+210.559903823" watchObservedRunningTime="2026-03-13 20:31:19.542862662 +0000 UTC m=+210.563978563" Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.580640 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:19 crc kubenswrapper[4790]: E0313 20:31:19.581138 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.08112311 +0000 UTC m=+211.102239001 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.611962 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zwfns" event={"ID":"5f9c2f7c-9058-4ad2-84a2-037d212792ad","Type":"ContainerStarted","Data":"0eedb6e2e026bf0350d7cfb4fedfa87785c3792514589d8122f9b4d9cd911bcb"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.634835 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jw27w"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.634884 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-vggp9" event={"ID":"979fe4d1-6e0f-4b07-b994-c183a200a1cc","Type":"ContainerStarted","Data":"7764814c3f81c716083fe3d17dce5b432444f2236b1e438d519d3d6c955d6ac3"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.657169 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" event={"ID":"4fa77308-6519-4481-b87b-4a1b066bada3","Type":"ContainerStarted","Data":"f25db4438bee6f278153a4aaf42cb8022dd0c3a47d7c97ce09868dc42bec3cf1"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.681303 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-vggp9" podStartSLOduration=6.681282565 
podStartE2EDuration="6.681282565s" podCreationTimestamp="2026-03-13 20:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:19.678485149 +0000 UTC m=+210.699601060" watchObservedRunningTime="2026-03-13 20:31:19.681282565 +0000 UTC m=+210.702398456" Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.681867 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:19 crc kubenswrapper[4790]: E0313 20:31:19.683292 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.183261149 +0000 UTC m=+211.204377040 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.686400 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:19 crc kubenswrapper[4790]: E0313 20:31:19.686954 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.186940508 +0000 UTC m=+211.208056399 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.787166 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:19 crc kubenswrapper[4790]: E0313 20:31:19.788410 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.288391868 +0000 UTC m=+211.309507759 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.825361 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831361 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" event={"ID":"8313e458-290f-42ba-8656-dc9dcf0e0b98","Type":"ContainerStarted","Data":"e7af220ef9cb06ca1618fb14587d09a03ba5e8c64c063b4963a7dff1277a1dfa"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831410 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831432 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" event={"ID":"8313e458-290f-42ba-8656-dc9dcf0e0b98","Type":"ContainerStarted","Data":"2c7848e321e3bffdebe62b492ffe330b9d19d7a800b06f31cba20c02985722d0"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831443 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" event={"ID":"3635b091-f7bf-4c6d-bb7a-5723b36f990f","Type":"ContainerStarted","Data":"2fe35e99845f5e357405009c0aeae1924f14ff443ac7a27625d938eed41ee4c9"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831459 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831476 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-pzx4q" event={"ID":"658b4bb6-837c-48ed-b5f3-aa30bd1e9740","Type":"ContainerStarted","Data":"ef8810e1999c0e38d934d579b1b6991b98c1734764e9479e6ac6a38a3aac4d83"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831486 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" event={"ID":"31b24f51-5194-4af5-a171-bd55caaf8ded","Type":"ContainerStarted","Data":"de0f1d50be7a8f2eacacb074e61715f55608c2fedfc108a8f20a00f9559d0971"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831498 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831510 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831519 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831531 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" event={"ID":"2f612fb7-c001-4a97-b17c-008bcf100be1","Type":"ContainerStarted","Data":"3c0be889c426f02bf89b75e3c89fa6c3a6dbbffe59e1eea58f2f512e037baa1d"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831572 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831587 4790 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" event={"ID":"2f612fb7-c001-4a97-b17c-008bcf100be1","Type":"ContainerStarted","Data":"7a46ca8e0bdbb4d72f3d47182d95660d4ed68daa8376f6819cafacb28a6f0ed6"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831604 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" event={"ID":"21386249-439b-4454-828b-f9da9ecce958","Type":"ContainerStarted","Data":"c6a883977c0139d5e2572820fd8eb84305881145c03efffe0c863048ab150134"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831617 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" event={"ID":"071ab142-7ad6-43bc-aa6a-e6761ea33b15","Type":"ContainerStarted","Data":"257694f26f804f72df81d6133e0baf8090ad14d327a03a5c9806ac387bc5f050"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831630 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" event={"ID":"aa273b20-a91d-43ea-a18d-784ad7cdc7a7","Type":"ContainerStarted","Data":"9adf4c0180dfbe028bdf748ff6214a4ac18946f1fa1412ba91b6e4cf19a21205"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831645 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" event={"ID":"75413740-91a3-4356-8cbd-4b5d2e7ff7ac","Type":"ContainerStarted","Data":"bc28c370c2cd3eea34feb8211cc516775cf455ee4b1d4b7b2006a39cf1fbccca"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.833438 4790 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-jpkh8 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" start-of-body= Mar 13 
20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.833487 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" podUID="75413740-91a3-4356-8cbd-4b5d2e7ff7ac" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.838893 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" event={"ID":"a626166a-5d74-4dd9-b838-746731bfedef","Type":"ContainerStarted","Data":"ee9bca9a81de3dc588b20df5dee11de8f4a9b928f5c22d19b9d18575c546ec8b"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.848061 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q5j7f" event={"ID":"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c","Type":"ContainerStarted","Data":"af91b2c2002cfba8d95ebe9f9e0aa50107b9d61f68613dde04ff9ae4ab302650"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.884651 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cxj7h" event={"ID":"bcf10b74-f8ce-4748-a813-5aefe86f13f7","Type":"ContainerStarted","Data":"faa34ab9170c82fd862ea9ca00bc38b95f4e671591c94fdc7f41df6a91938ca8"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.884961 4790 patch_prober.go:28] interesting pod/console-operator-58897d9998-rmlmp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.884995 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-rmlmp" podUID="94386d3d-038a-4e4d-9e97-fd04336847a0" containerName="console-operator" 
probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.888995 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:19 crc kubenswrapper[4790]: E0313 20:31:19.901374 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.401348239 +0000 UTC m=+211.422464130 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.994948 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:19 crc kubenswrapper[4790]: E0313 20:31:19.996401 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.496360485 +0000 UTC m=+211.517476376 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.997888 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:20 crc kubenswrapper[4790]: E0313 20:31:20.005681 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.505663338 +0000 UTC m=+211.526779299 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.084794 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-x7zgr container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 13 20:31:20 crc kubenswrapper[4790]: [+]log ok Mar 13 20:31:20 crc kubenswrapper[4790]: [+]etcd ok Mar 13 20:31:20 crc kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 13 20:31:20 crc kubenswrapper[4790]: [-]poststarthook/generic-apiserver-start-informers failed: reason withheld Mar 13 20:31:20 crc kubenswrapper[4790]: [-]poststarthook/max-in-flight-filter failed: reason withheld Mar 13 20:31:20 crc kubenswrapper[4790]: [-]poststarthook/storage-object-count-tracker-hook failed: reason withheld Mar 13 20:31:20 crc kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 13 20:31:20 crc kubenswrapper[4790]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 13 20:31:20 crc kubenswrapper[4790]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 13 20:31:20 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectcache ok Mar 13 20:31:20 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 13 20:31:20 crc kubenswrapper[4790]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Mar 13 20:31:20 crc 
kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 13 20:31:20 crc kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 13 20:31:20 crc kubenswrapper[4790]: livez check failed Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.084860 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" podUID="1db4655f-49dd-48c8-a290-c3c4f2fb74ba" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.104838 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:20 crc kubenswrapper[4790]: E0313 20:31:20.105238 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.605220226 +0000 UTC m=+211.626336117 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.206281 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:20 crc kubenswrapper[4790]: E0313 20:31:20.206673 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.706657595 +0000 UTC m=+211.727773486 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.308426 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:20 crc kubenswrapper[4790]: E0313 20:31:20.308617 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.808587178 +0000 UTC m=+211.829703069 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.309072 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:20 crc kubenswrapper[4790]: E0313 20:31:20.309484 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.809470912 +0000 UTC m=+211.830586803 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.340730 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" podStartSLOduration=165.340712769 podStartE2EDuration="2m45.340712769s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:20.304790695 +0000 UTC m=+211.325906596" watchObservedRunningTime="2026-03-13 20:31:20.340712769 +0000 UTC m=+211.361828660" Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.405132 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" podStartSLOduration=166.405110534 podStartE2EDuration="2m46.405110534s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:20.341190981 +0000 UTC m=+211.362306872" watchObservedRunningTime="2026-03-13 20:31:20.405110534 +0000 UTC m=+211.426226425" Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.411731 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.423044 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-pzx4q" podStartSLOduration=165.42302445 podStartE2EDuration="2m45.42302445s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:20.419956597 +0000 UTC m=+211.441072488" watchObservedRunningTime="2026-03-13 20:31:20.42302445 +0000 UTC m=+211.444140341" Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.423413 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-q5j7f" podStartSLOduration=166.423407461 podStartE2EDuration="2m46.423407461s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:20.388620337 +0000 UTC m=+211.409736228" watchObservedRunningTime="2026-03-13 20:31:20.423407461 +0000 UTC m=+211.444523352" Mar 13 20:31:20 crc kubenswrapper[4790]: E0313 20:31:20.429521 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.929483195 +0000 UTC m=+211.950599076 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.459671 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-cxj7h" podStartSLOduration=7.459643163 podStartE2EDuration="7.459643163s" podCreationTimestamp="2026-03-13 20:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:20.457233087 +0000 UTC m=+211.478348978" watchObservedRunningTime="2026-03-13 20:31:20.459643163 +0000 UTC m=+211.480759054" Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.522167 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:20 crc kubenswrapper[4790]: E0313 20:31:20.524290 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.024270324 +0000 UTC m=+212.045386215 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.638093 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:20 crc kubenswrapper[4790]: E0313 20:31:20.638532 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.138509461 +0000 UTC m=+212.159625352 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.703453 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.714013 4790 patch_prober.go:28] interesting pod/router-default-5444994796-pzx4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 20:31:20 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Mar 13 20:31:20 crc kubenswrapper[4790]: [+]process-running ok Mar 13 20:31:20 crc kubenswrapper[4790]: healthz check failed Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.714074 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pzx4q" podUID="658b4bb6-837c-48ed-b5f3-aa30bd1e9740" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.740034 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:20 crc kubenswrapper[4790]: E0313 20:31:20.740429 4790 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.240418493 +0000 UTC m=+212.261534384 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.841401 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:20 crc kubenswrapper[4790]: E0313 20:31:20.842097 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.342078998 +0000 UTC m=+212.363194889 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.906982 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" event={"ID":"53c38463-b7c5-42c8-a447-7d0e7f190aa9","Type":"ContainerStarted","Data":"ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010"} Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.907031 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" event={"ID":"53c38463-b7c5-42c8-a447-7d0e7f190aa9","Type":"ContainerStarted","Data":"bad985ac5d6a6fd6a14b185a97704f5e25df7aba222388f921733e6977b5b5eb"} Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.907611 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.912187 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" event={"ID":"71ed135e-3db4-4f03-a89e-f82bc3cf0b34","Type":"ContainerStarted","Data":"21dd5065f044c4cde3c563bc54cd1a1d73c878a3e4d4ae12f5615c1c41fddac6"} Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.921434 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" 
event={"ID":"2f612fb7-c001-4a97-b17c-008bcf100be1","Type":"ContainerStarted","Data":"a1a65fb61f0ec00e53a4d527c48b1b58cdcd0982d95fec32f4e5401038a3b50d"} Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.937618 4790 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jnbzb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.937712 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" podUID="53c38463-b7c5-42c8-a447-7d0e7f190aa9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.943295 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" event={"ID":"aa273b20-a91d-43ea-a18d-784ad7cdc7a7","Type":"ContainerStarted","Data":"340e5a5971916a43c5b3ad0ffd63cb363edcbf940b0c07728572b29458170d7e"} Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.944555 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" podStartSLOduration=165.944536756 podStartE2EDuration="2m45.944536756s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:20.9443025 +0000 UTC m=+211.965418401" watchObservedRunningTime="2026-03-13 20:31:20.944536756 +0000 UTC m=+211.965652647" Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.945253 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:20 crc kubenswrapper[4790]: E0313 20:31:20.945605 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.445592194 +0000 UTC m=+212.466708085 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.956042 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" event={"ID":"32d4b8de-5800-44a1-b2d9-338e4d267866","Type":"ContainerStarted","Data":"ce5f8fb1afded31f135561bc74fdabae9e87ea779c474a5d7e5363e7393d45f1"} Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.956095 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" event={"ID":"32d4b8de-5800-44a1-b2d9-338e4d267866","Type":"ContainerStarted","Data":"afcd681e3ca3f274a0d102b69be8865accb5e4227dd4cf0cc0d2bee3d1b47374"} Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.988613 4790 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" podStartSLOduration=165.98859576 podStartE2EDuration="2m45.98859576s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:20.970855889 +0000 UTC m=+211.991971780" watchObservedRunningTime="2026-03-13 20:31:20.98859576 +0000 UTC m=+212.009711651" Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.993470 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" event={"ID":"a626166a-5d74-4dd9-b838-746731bfedef","Type":"ContainerStarted","Data":"5a42b8b73b9fdb6bdfefe8035ac61f25ee593a39779349eb3dea10234e77a77a"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.049321 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.049504 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" event={"ID":"929728d6-959b-4532-a9de-298aed7edb3f","Type":"ContainerStarted","Data":"ce676faf3686152a6bfeb6cc2c2d8447af229cebf15ff6aa7a4557faadf52069"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.049550 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" event={"ID":"929728d6-959b-4532-a9de-298aed7edb3f","Type":"ContainerStarted","Data":"878f640622a6e67c5340b650a71d21e2cd3303a67d77486fed168166e4311f6b"} Mar 13 20:31:21 crc kubenswrapper[4790]: E0313 20:31:21.051104 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.551083674 +0000 UTC m=+212.572199575 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.058566 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" podStartSLOduration=166.057334983 podStartE2EDuration="2m46.057334983s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.013144765 +0000 UTC m=+212.034260656" watchObservedRunningTime="2026-03-13 20:31:21.057334983 +0000 UTC m=+212.078450874" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.076518 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" event={"ID":"21386249-439b-4454-828b-f9da9ecce958","Type":"ContainerStarted","Data":"9f5fd8b34062b016215dca4785a4d76e453c761e8bd759cc3e3d46dbbbc45394"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.078071 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" podStartSLOduration=166.078057724 podStartE2EDuration="2m46.078057724s" 
podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.048451823 +0000 UTC m=+212.069567724" watchObservedRunningTime="2026-03-13 20:31:21.078057724 +0000 UTC m=+212.099173625" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.090490 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" podStartSLOduration=166.090467521 podStartE2EDuration="2m46.090467521s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.075846395 +0000 UTC m=+212.096962296" watchObservedRunningTime="2026-03-13 20:31:21.090467521 +0000 UTC m=+212.111583422" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.124524 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" podStartSLOduration=166.124505224 podStartE2EDuration="2m46.124505224s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.12216531 +0000 UTC m=+212.143281221" watchObservedRunningTime="2026-03-13 20:31:21.124505224 +0000 UTC m=+212.145621115" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.134914 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.135981 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.144992 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.145272 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.156898 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:21 crc kubenswrapper[4790]: E0313 20:31:21.159542 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.659525684 +0000 UTC m=+212.680641575 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.161954 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" podStartSLOduration=166.161937408 podStartE2EDuration="2m46.161937408s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.155821223 +0000 UTC m=+212.176937124" watchObservedRunningTime="2026-03-13 20:31:21.161937408 +0000 UTC m=+212.183053299" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.175627 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.178963 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78" event={"ID":"631645f5-2f1a-41e7-ba2a-a665c827acb5","Type":"ContainerStarted","Data":"e1debe9bc4fb5d653959ee72a515b194439e55c71d3c8cf3bb1c4e3160853ed3"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.179020 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78" event={"ID":"631645f5-2f1a-41e7-ba2a-a665c827acb5","Type":"ContainerStarted","Data":"4ccdcb482ed2307bbee84e2eeed0b6f5a304d21663576eaa76d1c3639c1c214a"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.191897 4790 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q5j7f" event={"ID":"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c","Type":"ContainerStarted","Data":"40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.210773 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" event={"ID":"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51","Type":"ContainerStarted","Data":"93c1f10337c2883de8c80150a75f7613328eeffafc6c4c7570ee71639cf9048a"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.210817 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" event={"ID":"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51","Type":"ContainerStarted","Data":"1809f43b88080170a440a364505c4febd360a062e9e4aabd772262f808d67b1c"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.213098 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" event={"ID":"0bc71397-bb77-45b3-92c4-77710458d4fe","Type":"ContainerStarted","Data":"8b273120e9e6fa6d21db14e7f4043b1874784896f97e4fb7f7e2509ebaba0d0e"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.228514 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" event={"ID":"4af8dabc-a918-4188-8257-112b5f8d71d0","Type":"ContainerStarted","Data":"4fff32c4ea6c00908e2e551a63c388c3c5d8ae382436561b2c16d0fbeeacdf04"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.238145 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" event={"ID":"8313e458-290f-42ba-8656-dc9dcf0e0b98","Type":"ContainerStarted","Data":"34345c9d241cc9a8ee4f6aefd62ef1d5124244f04a93c3f3cc8ce155d93b0a68"} Mar 13 20:31:21 crc 
kubenswrapper[4790]: I0313 20:31:21.247260 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zwfns" event={"ID":"5f9c2f7c-9058-4ad2-84a2-037d212792ad","Type":"ContainerStarted","Data":"cdaf2a7157fded22fd5efff64f4b448f805d453cc47c87b988c689fd7a955583"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.257795 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" podStartSLOduration=81.257779057 podStartE2EDuration="1m21.257779057s" podCreationTimestamp="2026-03-13 20:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.236748456 +0000 UTC m=+212.257864357" watchObservedRunningTime="2026-03-13 20:31:21.257779057 +0000 UTC m=+212.278894948" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.259481 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.259832 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"09cce78a-6bee-4201-82d7-a4e0dd041c9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 20:31:21 crc kubenswrapper[4790]: E0313 20:31:21.260199 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 20:31:21.760175751 +0000 UTC m=+212.781291662 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.260498 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.260869 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"09cce78a-6bee-4201-82d7-a4e0dd041c9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 20:31:21 crc kubenswrapper[4790]: E0313 20:31:21.260998 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.760987814 +0000 UTC m=+212.782103765 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.272715 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" event={"ID":"4e8cc2ad-07fc-4d24-956e-94599d58be06","Type":"ContainerStarted","Data":"bc2f09c9de4d6b44e872e0283c9d0d073285dbdc5330b514dd93f93bf6f21a25"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.272795 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" event={"ID":"4e8cc2ad-07fc-4d24-956e-94599d58be06","Type":"ContainerStarted","Data":"73101d384387fa131c32f69a2704201e88074c0216cd838677f8af7c2487d487"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.283455 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jw27w" event={"ID":"9e6c6344-8059-43d7-97be-273d115b8471","Type":"ContainerStarted","Data":"89206cfdc78475e8ddb6a6001e20974b19dbaaadccf47cbc7348fe466c89a707"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.284930 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" event={"ID":"4fa77308-6519-4481-b87b-4a1b066bada3","Type":"ContainerStarted","Data":"f8f3ef1fa0a5c5987f219c07af21805cc56c30af75e76960d8709d781e5bcbe4"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.286705 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" 
event={"ID":"c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55","Type":"ContainerStarted","Data":"9cf6faf1a15f9ad2088ec01ab75d065c7c611763631a16bffb9c50a25548558b"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.287417 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.297052 4790 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-mcrq2 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.297116 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" podUID="c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.300932 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" event={"ID":"4cfd91e9-ce88-4004-b936-551d50d26a7d","Type":"ContainerStarted","Data":"b62be7a9fef85b12a5dc19d34e61675a0b73a7b4e7b5dc0d47df16a785b5e008"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.300979 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" event={"ID":"4cfd91e9-ce88-4004-b936-551d50d26a7d","Type":"ContainerStarted","Data":"ca4753864246a2b6c303ac90d6f0160c76b073a0ba1800b707c270a9f5dc8841"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.324582 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.327027 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" podStartSLOduration=166.326995942 podStartE2EDuration="2m46.326995942s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.323733264 +0000 UTC m=+212.344849165" watchObservedRunningTime="2026-03-13 20:31:21.326995942 +0000 UTC m=+212.348111833" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.329966 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" podStartSLOduration=166.329954812 podStartE2EDuration="2m46.329954812s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.26015324 +0000 UTC m=+212.281269141" watchObservedRunningTime="2026-03-13 20:31:21.329954812 +0000 UTC m=+212.351070703" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.340995 4790 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-q2wgf container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.341170 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" podUID="4cfd91e9-ce88-4004-b936-551d50d26a7d" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 
10.217.0.29:8443: connect: connection refused" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.343683 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" event={"ID":"31b24f51-5194-4af5-a171-bd55caaf8ded","Type":"ContainerStarted","Data":"293d549b7bb7c41d24c912435a30ac98598b11c8ed0e5af3cfe6278641178582"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.345007 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.368044 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.368399 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"09cce78a-6bee-4201-82d7-a4e0dd041c9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.368646 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"09cce78a-6bee-4201-82d7-a4e0dd041c9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 20:31:21 crc kubenswrapper[4790]: E0313 20:31:21.369425 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.869406972 +0000 UTC m=+212.890522863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.371607 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"09cce78a-6bee-4201-82d7-a4e0dd041c9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.375827 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" podStartSLOduration=166.375804616 podStartE2EDuration="2m46.375804616s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.372583609 +0000 UTC m=+212.393699510" watchObservedRunningTime="2026-03-13 20:31:21.375804616 +0000 UTC m=+212.396920507" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.378110 4790 generic.go:334] "Generic (PLEG): container finished" podID="3635b091-f7bf-4c6d-bb7a-5723b36f990f" containerID="1f34a294151ef6b1c2c7705742307a127d627ac02b120c12574cf0048804f635" exitCode=0 Mar 13 20:31:21 crc 
kubenswrapper[4790]: I0313 20:31:21.378206 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" event={"ID":"3635b091-f7bf-4c6d-bb7a-5723b36f990f","Type":"ContainerDied","Data":"1f34a294151ef6b1c2c7705742307a127d627ac02b120c12574cf0048804f635"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.407316 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"09cce78a-6bee-4201-82d7-a4e0dd041c9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.411625 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" podStartSLOduration=166.411554114 podStartE2EDuration="2m46.411554114s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.400698861 +0000 UTC m=+212.421814762" watchObservedRunningTime="2026-03-13 20:31:21.411554114 +0000 UTC m=+212.432670005" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.428955 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" event={"ID":"3baed13c-c4c1-4fc2-9527-bfd2273efbbb","Type":"ContainerStarted","Data":"5198eb8751d46e63c5d90956f3221921504590f0af9ace66aa9925f094473df7"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.458004 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" 
event={"ID":"75413740-91a3-4356-8cbd-4b5d2e7ff7ac","Type":"ContainerStarted","Data":"a0ef3e6162bf90124e9ee5a7a397ff11c686b96e7f49e283e53be6b7c0e7ccc2"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.460325 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" podStartSLOduration=167.460308936 podStartE2EDuration="2m47.460308936s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.430722564 +0000 UTC m=+212.451838455" watchObservedRunningTime="2026-03-13 20:31:21.460308936 +0000 UTC m=+212.481424827" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.469531 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:21 crc kubenswrapper[4790]: E0313 20:31:21.470057 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.970043199 +0000 UTC m=+212.991159100 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.473804 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" event={"ID":"d88c0d3c-4e7a-4dd8-a99d-6118b840c031","Type":"ContainerStarted","Data":"7c62368176591927ba848072fa4aeb9ff3ebb04bdfaaa36069f9cbbe368c8b44"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.474119 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" event={"ID":"d88c0d3c-4e7a-4dd8-a99d-6118b840c031","Type":"ContainerStarted","Data":"826f2a917cb49a4c81678f749c3be86fc911427c2767f16af7f1d57bf83d6e66"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.495406 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.495466 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.502653 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" podStartSLOduration=166.502637613 podStartE2EDuration="2m46.502637613s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.460914322 +0000 UTC m=+212.482030213" watchObservedRunningTime="2026-03-13 20:31:21.502637613 +0000 UTC m=+212.523753504" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.503028 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.588907 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.589495 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" podStartSLOduration=166.589475777 podStartE2EDuration="2m46.589475777s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.527915348 +0000 UTC m=+212.549031239" watchObservedRunningTime="2026-03-13 20:31:21.589475777 +0000 UTC m=+212.610591668" Mar 13 20:31:21 crc kubenswrapper[4790]: E0313 20:31:21.590400 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 20:31:22.090362761 +0000 UTC m=+213.111478652 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.694765 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.694827 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.694866 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.694897 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.694939 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:31:21 crc kubenswrapper[4790]: E0313 20:31:21.698847 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:22.198828331 +0000 UTC m=+213.219944222 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.699208 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.700529 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.701112 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.702997 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.707240 4790 patch_prober.go:28] interesting pod/router-default-5444994796-pzx4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 20:31:21 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Mar 13 20:31:21 crc kubenswrapper[4790]: [+]process-running ok Mar 13 20:31:21 crc kubenswrapper[4790]: healthz check failed Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.707278 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pzx4q" podUID="658b4bb6-837c-48ed-b5f3-aa30bd1e9740" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 20:31:21 crc kubenswrapper[4790]: E0313 20:31:21.798092 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:22.29804722 +0000 UTC m=+213.319163111 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.809223 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.809543 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.809604 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:31:21 crc kubenswrapper[4790]: E0313 20:31:21.810538 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-13 20:31:22.310521869 +0000 UTC m=+213.331637760 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.818977 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.893018 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.899878 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.911197 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.912585 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:21 crc kubenswrapper[4790]: E0313 20:31:21.912871 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:22.412843912 +0000 UTC m=+213.433959803 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.915936 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.014689 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.015449 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:22.515434703 +0000 UTC m=+213.536550594 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.115513 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.116519 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:22.616500172 +0000 UTC m=+213.637616063 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.131925 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" podStartSLOduration=167.1319062 podStartE2EDuration="2m47.1319062s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.599481019 +0000 UTC m=+212.620596920" watchObservedRunningTime="2026-03-13 20:31:22.1319062 +0000 UTC m=+213.153022101" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.134129 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 20:31:22 crc kubenswrapper[4790]: W0313 20:31:22.167400 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod09cce78a_6bee_4201_82d7_a4e0dd041c9f.slice/crio-78a1bbc4c5af6c1cce5a2bc6069daf0b91593a873ea629e68a80642ae7614598 WatchSource:0}: Error finding container 78a1bbc4c5af6c1cce5a2bc6069daf0b91593a873ea629e68a80642ae7614598: Status 404 returned error can't find the container with id 78a1bbc4c5af6c1cce5a2bc6069daf0b91593a873ea629e68a80642ae7614598 Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.217254 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.217652 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:22.717635523 +0000 UTC m=+213.738751414 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.320575 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.321061 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:22.821037166 +0000 UTC m=+213.842153057 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.421914 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.422293 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:22.92227415 +0000 UTC m=+213.943390041 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.458442 4790 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-jpkh8 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": context deadline exceeded" start-of-body= Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.458510 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" podUID="75413740-91a3-4356-8cbd-4b5d2e7ff7ac" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": context deadline exceeded" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.465975 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34002: no serving certificate available for the kubelet" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.523808 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.524167 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-03-13 20:31:23.024151982 +0000 UTC m=+214.045267873 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.536114 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78" event={"ID":"631645f5-2f1a-41e7-ba2a-a665c827acb5","Type":"ContainerStarted","Data":"fd32340d1498e106ddfdfcea79637a893f01d9582eaa1568adeb1c2bb2e6b827"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.564497 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78" podStartSLOduration=167.564479715 podStartE2EDuration="2m47.564479715s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:22.563889169 +0000 UTC m=+213.585005060" watchObservedRunningTime="2026-03-13 20:31:22.564479715 +0000 UTC m=+213.585595606" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.573575 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" event={"ID":"4e8cc2ad-07fc-4d24-956e-94599d58be06","Type":"ContainerStarted","Data":"db20261fe3f06ccd31a0b3d8e807bbc22bd1ed42651686b532bf934a6366cb7c"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.585826 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34008: no serving 
certificate available for the kubelet" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.612754 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" event={"ID":"d88c0d3c-4e7a-4dd8-a99d-6118b840c031","Type":"ContainerStarted","Data":"6f8fbae1ffd9531f9d9211edbdf7ab1cecaf96d184d3a4e60ed1ffbc8ac0fcaa"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.623055 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" podStartSLOduration=167.623033552 podStartE2EDuration="2m47.623033552s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:22.619802885 +0000 UTC m=+213.640918786" watchObservedRunningTime="2026-03-13 20:31:22.623033552 +0000 UTC m=+213.644149443" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.628142 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.628450 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:23.128438999 +0000 UTC m=+214.149554890 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.641733 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" event={"ID":"4af8dabc-a918-4188-8257-112b5f8d71d0","Type":"ContainerStarted","Data":"4b1fa189bc57f47fde13d528521c81259420b8568b801996e6c3cfa04b4187ec"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.666712 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zwfns" event={"ID":"5f9c2f7c-9058-4ad2-84a2-037d212792ad","Type":"ContainerStarted","Data":"9a63e70d6925e6cb469f20bebea13af4cebd4aed1f83a98e3f2e2e1ef7b88237"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.667070 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.680485 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" event={"ID":"31b24f51-5194-4af5-a171-bd55caaf8ded","Type":"ContainerStarted","Data":"48e4f1927995071e9e6ec8613fddac83aa2a0b65d33722eb0aa5845873e7e4d2"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.719803 4790 patch_prober.go:28] interesting pod/router-default-5444994796-pzx4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 20:31:22 crc 
kubenswrapper[4790]: [-]has-synced failed: reason withheld Mar 13 20:31:22 crc kubenswrapper[4790]: [+]process-running ok Mar 13 20:31:22 crc kubenswrapper[4790]: healthz check failed Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.720135 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pzx4q" podUID="658b4bb6-837c-48ed-b5f3-aa30bd1e9740" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.721650 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34016: no serving certificate available for the kubelet" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.729631 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.729939 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:23.229913489 +0000 UTC m=+214.251029390 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.732162 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.736413 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:23.236373234 +0000 UTC m=+214.257489125 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.742133 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zwfns" podStartSLOduration=9.742110479 podStartE2EDuration="9.742110479s" podCreationTimestamp="2026-03-13 20:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:22.71963948 +0000 UTC m=+213.740755371" watchObservedRunningTime="2026-03-13 20:31:22.742110479 +0000 UTC m=+213.763226370" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.756981 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" event={"ID":"3635b091-f7bf-4c6d-bb7a-5723b36f990f","Type":"ContainerStarted","Data":"ddec58bb3589a072b686abb8229af2b81ca1aa84380efb2669cc93395f70497a"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.771961 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" event={"ID":"3baed13c-c4c1-4fc2-9527-bfd2273efbbb","Type":"ContainerStarted","Data":"d5240bb98341498587525757f6e4ac55183c53771ddc0ee8c8ca6abf568dec2f"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.775818 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"09cce78a-6bee-4201-82d7-a4e0dd041c9f","Type":"ContainerStarted","Data":"78a1bbc4c5af6c1cce5a2bc6069daf0b91593a873ea629e68a80642ae7614598"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.808131 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" podStartSLOduration=167.808106499 podStartE2EDuration="2m47.808106499s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:22.801114699 +0000 UTC m=+213.822230600" watchObservedRunningTime="2026-03-13 20:31:22.808106499 +0000 UTC m=+213.829222410" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.837325 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.837480 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:23.337451783 +0000 UTC m=+214.358567674 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.837934 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.838309 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" event={"ID":"0bc71397-bb77-45b3-92c4-77710458d4fe","Type":"ContainerStarted","Data":"49c5d026eb64b4d5647200c969b2b62605404470459ddf55f9ef06a7895500c5"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.839810 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34018: no serving certificate available for the kubelet" Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.840093 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:23.340081715 +0000 UTC m=+214.361197606 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.853571 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" event={"ID":"c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55","Type":"ContainerStarted","Data":"da1d931701622f01a7bf5ec6ba53ffdf4132f55b9cab7471ac5b57db237b6ce3"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.867067 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.884213 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" event={"ID":"71ed135e-3db4-4f03-a89e-f82bc3cf0b34","Type":"ContainerStarted","Data":"898f5a100df4238843c8507dc6c1ed32963c9d645ec72bd78f8daa0e5433a92b"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.884247 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" event={"ID":"71ed135e-3db4-4f03-a89e-f82bc3cf0b34","Type":"ContainerStarted","Data":"9a6fe40383d5e82d52a9783b6b1e61846ba488ba17a8a6e15b807ad841fc746e"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.885749 4790 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jnbzb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 
10.217.0.36:8080: connect: connection refused" start-of-body= Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.885763 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.885788 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" podUID="53c38463-b7c5-42c8-a447-7d0e7f190aa9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.885821 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.889155 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" podStartSLOduration=167.889139955 podStartE2EDuration="2m47.889139955s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:22.878318522 +0000 UTC m=+213.899434423" watchObservedRunningTime="2026-03-13 20:31:22.889139955 +0000 UTC m=+213.910255846" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.897995 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:22 crc 
kubenswrapper[4790]: I0313 20:31:22.929605 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.930024 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34034: no serving certificate available for the kubelet" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.941215 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.943982 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:23.443955491 +0000 UTC m=+214.465071392 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: W0313 20:31:22.964634 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-e01da82f6f5bf61206a53cb3ab6efa1113b6fb4c1e9510d6d10b9d4395b585a3 WatchSource:0}: Error finding container e01da82f6f5bf61206a53cb3ab6efa1113b6fb4c1e9510d6d10b9d4395b585a3: Status 404 returned error can't find the container with id e01da82f6f5bf61206a53cb3ab6efa1113b6fb4c1e9510d6d10b9d4395b585a3 Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.044081 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.044745 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:23.544728982 +0000 UTC m=+214.565844873 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.057488 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" podStartSLOduration=168.057465827 podStartE2EDuration="2m48.057465827s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:23.010923605 +0000 UTC m=+214.032039516" watchObservedRunningTime="2026-03-13 20:31:23.057465827 +0000 UTC m=+214.078581728" Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.083160 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34038: no serving certificate available for the kubelet" Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.095754 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mnf26"] Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.145192 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.145456 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:23.645438442 +0000 UTC m=+214.666554333 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.248112 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34052: no serving certificate available for the kubelet" Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.249141 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.249574 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:23.749553925 +0000 UTC m=+214.770669806 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.334643 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34066: no serving certificate available for the kubelet" Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.357560 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.357968 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:23.857949502 +0000 UTC m=+214.879065393 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.462090 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.462460 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:23.962447635 +0000 UTC m=+214.983563526 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.562953 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.563227 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.063206676 +0000 UTC m=+215.084322567 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.655586 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34074: no serving certificate available for the kubelet" Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.663916 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.664239 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.164227125 +0000 UTC m=+215.185343026 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.710521 4790 patch_prober.go:28] interesting pod/router-default-5444994796-pzx4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 20:31:23 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Mar 13 20:31:23 crc kubenswrapper[4790]: [+]process-running ok Mar 13 20:31:23 crc kubenswrapper[4790]: healthz check failed Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.710579 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pzx4q" podUID="658b4bb6-837c-48ed-b5f3-aa30bd1e9740" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.765649 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.765847 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 20:31:24.265817138 +0000 UTC m=+215.286933029 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.765956 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.766327 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.266312281 +0000 UTC m=+215.287428172 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.866518 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.866670 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.366646551 +0000 UTC m=+215.387762452 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.866734 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.867058 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.367046962 +0000 UTC m=+215.388162873 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.906200 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mnf26" event={"ID":"c54336a0-5a12-4bf9-9807-337dd352fdb6","Type":"ContainerStarted","Data":"0b6794c9fe65f322c28666c06c92a498ea123712d515a209d34b5f14547c9762"} Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.906265 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mnf26" event={"ID":"c54336a0-5a12-4bf9-9807-337dd352fdb6","Type":"ContainerStarted","Data":"a10f7b028aa807afab8b4ec493ac5112136565647beb352116cc09abd06040c3"} Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.907621 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"76a39aadffb3c2d5cb1dd7bb82e2424e525eeb3d16ec6e80c6d388ce2f9367ba"} Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.907648 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e01da82f6f5bf61206a53cb3ab6efa1113b6fb4c1e9510d6d10b9d4395b585a3"} Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.908532 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:31:23 crc 
kubenswrapper[4790]: I0313 20:31:23.935627 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"5a9565f34a0cfe251d0647d382da9853c80accd576c3333cc9f51fa4ddfb07ad"} Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.935687 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"140f781797bfadbb905fa57351522884427bbf6f5f7df4cc6c210e4bf8aa60dd"} Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.939662 4790 generic.go:334] "Generic (PLEG): container finished" podID="09cce78a-6bee-4201-82d7-a4e0dd041c9f" containerID="835625bd5b25d33532e6f1a4c1701e10e108eacd1b2af7b16c3e421ede1a0acf" exitCode=0 Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.939792 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"09cce78a-6bee-4201-82d7-a4e0dd041c9f","Type":"ContainerDied","Data":"835625bd5b25d33532e6f1a4c1701e10e108eacd1b2af7b16c3e421ede1a0acf"} Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.947666 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8f68ac2f7863be3453e9957cc4344e636b513ffff4577dc51f946700572b6683"} Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.947719 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f121558a4493e216aae0979d701d802330aaa8cc4d4d43b6a5a8d79a4a7f6c7d"} Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 
20:31:23.952531 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jw27w" event={"ID":"9e6c6344-8059-43d7-97be-273d115b8471","Type":"ContainerStarted","Data":"35442addd7079011ecf48aca388391d4e0f716842b0ac5563911482db0c7ab7d"} Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.971577 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.972193 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.978533 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.478499133 +0000 UTC m=+215.499615024 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.073667 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:24 crc kubenswrapper[4790]: E0313 20:31:24.076936 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.5769204 +0000 UTC m=+215.598036291 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.179227 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:24 crc kubenswrapper[4790]: E0313 20:31:24.179649 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.679629915 +0000 UTC m=+215.700745806 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.281228 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:24 crc kubenswrapper[4790]: E0313 20:31:24.281635 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.781619449 +0000 UTC m=+215.802735340 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.383087 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:24 crc kubenswrapper[4790]: E0313 20:31:24.383277 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.883247133 +0000 UTC m=+215.904363024 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.383358 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:24 crc kubenswrapper[4790]: E0313 20:31:24.383734 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.883718846 +0000 UTC m=+215.904834737 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.484206 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:24 crc kubenswrapper[4790]: E0313 20:31:24.484364 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.984345644 +0000 UTC m=+216.005461535 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.484477 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:24 crc kubenswrapper[4790]: E0313 20:31:24.484748 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.984739485 +0000 UTC m=+216.005855376 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.529085 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-672cv"] Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.530110 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-672cv" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.532059 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.532991 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.552889 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.555894 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-672cv"] Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.585538 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:24 crc 
kubenswrapper[4790]: I0313 20:31:24.585737 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-utilities\") pod \"community-operators-672cv\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " pod="openshift-marketplace/community-operators-672cv" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.585882 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-catalog-content\") pod \"community-operators-672cv\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " pod="openshift-marketplace/community-operators-672cv" Mar 13 20:31:24 crc kubenswrapper[4790]: E0313 20:31:24.585927 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:25.085909127 +0000 UTC m=+216.107025018 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.585987 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhkbp\" (UniqueName: \"kubernetes.io/projected/dbee8a79-e625-49ef-8fcb-944341ae6e37-kube-api-access-zhkbp\") pod \"community-operators-672cv\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " pod="openshift-marketplace/community-operators-672cv" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.687158 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhkbp\" (UniqueName: \"kubernetes.io/projected/dbee8a79-e625-49ef-8fcb-944341ae6e37-kube-api-access-zhkbp\") pod \"community-operators-672cv\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " pod="openshift-marketplace/community-operators-672cv" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.687277 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-utilities\") pod \"community-operators-672cv\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " pod="openshift-marketplace/community-operators-672cv" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.687340 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.687392 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-catalog-content\") pod \"community-operators-672cv\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " pod="openshift-marketplace/community-operators-672cv" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.687897 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-utilities\") pod \"community-operators-672cv\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " pod="openshift-marketplace/community-operators-672cv" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.689735 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-catalog-content\") pod \"community-operators-672cv\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " pod="openshift-marketplace/community-operators-672cv" Mar 13 20:31:24 crc kubenswrapper[4790]: E0313 20:31:24.690058 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:25.19004298 +0000 UTC m=+216.211158971 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.806601 4790 patch_prober.go:28] interesting pod/router-default-5444994796-pzx4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 20:31:24 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Mar 13 20:31:24 crc kubenswrapper[4790]: [+]process-running ok Mar 13 20:31:24 crc kubenswrapper[4790]: healthz check failed Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.806659 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pzx4q" podUID="658b4bb6-837c-48ed-b5f3-aa30bd1e9740" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.809213 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:24 crc kubenswrapper[4790]: E0313 20:31:24.809601 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 20:31:25.309585399 +0000 UTC m=+216.330701300 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.811059 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-txx64"] Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.812534 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.815552 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.815772 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-txx64"] Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.831075 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhkbp\" (UniqueName: \"kubernetes.io/projected/dbee8a79-e625-49ef-8fcb-944341ae6e37-kube-api-access-zhkbp\") pod \"community-operators-672cv\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " pod="openshift-marketplace/community-operators-672cv" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.853591 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-672cv" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.911518 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.911620 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-utilities\") pod \"certified-operators-txx64\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.911654 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hskct\" (UniqueName: \"kubernetes.io/projected/7080e6b3-5934-4c2c-9361-23d20b5a495e-kube-api-access-hskct\") pod \"certified-operators-txx64\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.911676 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-catalog-content\") pod \"certified-operators-txx64\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:31:24 crc kubenswrapper[4790]: E0313 20:31:24.911976 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:25.411960945 +0000 UTC m=+216.433076826 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.929343 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5tr4n"] Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.933253 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.948397 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5tr4n"] Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.973533 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34076: no serving certificate available for the kubelet" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.012919 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.013143 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 20:31:25.513086035 +0000 UTC m=+216.534201936 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.013355 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dmtw\" (UniqueName: \"kubernetes.io/projected/446f0f4c-a97c-47d0-929d-0b99e07c8186-kube-api-access-4dmtw\") pod \"community-operators-5tr4n\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.013420 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.013572 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-catalog-content\") pod \"community-operators-5tr4n\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.013607 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-utilities\") pod \"certified-operators-txx64\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.013698 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-utilities\") pod \"community-operators-5tr4n\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.013785 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hskct\" (UniqueName: \"kubernetes.io/projected/7080e6b3-5934-4c2c-9361-23d20b5a495e-kube-api-access-hskct\") pod \"certified-operators-txx64\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.013817 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-catalog-content\") pod \"certified-operators-txx64\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.014325 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:25.514306149 +0000 UTC m=+216.535422140 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.017608 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-catalog-content\") pod \"certified-operators-txx64\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.017885 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-utilities\") pod \"certified-operators-txx64\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.041249 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hskct\" (UniqueName: \"kubernetes.io/projected/7080e6b3-5934-4c2c-9361-23d20b5a495e-kube-api-access-hskct\") pod \"certified-operators-txx64\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.114767 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.114924 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-catalog-content\") pod \"community-operators-5tr4n\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.114951 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-utilities\") pod \"community-operators-5tr4n\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.114979 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dmtw\" (UniqueName: \"kubernetes.io/projected/446f0f4c-a97c-47d0-929d-0b99e07c8186-kube-api-access-4dmtw\") pod \"community-operators-5tr4n\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.116217 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:25.616202401 +0000 UTC m=+216.637318292 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.137867 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-catalog-content\") pod \"community-operators-5tr4n\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.142006 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-utilities\") pod \"community-operators-5tr4n\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.144323 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-df8gv"] Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.148736 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.150529 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-df8gv"] Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.151927 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dmtw\" (UniqueName: \"kubernetes.io/projected/446f0f4c-a97c-47d0-929d-0b99e07c8186-kube-api-access-4dmtw\") pod \"community-operators-5tr4n\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.186815 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zsqd7"] Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.187089 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" podUID="869d7601-27fe-4a6a-840b-a9811c4d1e06" containerName="controller-manager" containerID="cri-o://b3f64a80f53b3463abb2e75cb2ad8094df85b77279ffcd7d0508ada4f6f68f83" gracePeriod=30 Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.199752 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.233230 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.233721 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:25.733707126 +0000 UTC m=+216.754823017 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.244088 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g"] Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.244364 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" podUID="1f45edb0-2914-47c2-82f3-a0f5a99fe9e9" containerName="route-controller-manager" containerID="cri-o://b0fb5457a9676ea9d3a55511a014a0d139b4e8575ca4d1d1a0534aae99f0076d" gracePeriod=30 Mar 13 20:31:25 crc 
kubenswrapper[4790]: I0313 20:31:25.259768 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.334723 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.334927 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-utilities\") pod \"certified-operators-df8gv\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.334949 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwk57\" (UniqueName: \"kubernetes.io/projected/da03af74-8c59-4ccf-aff8-03dc6303e322-kube-api-access-fwk57\") pod \"certified-operators-df8gv\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.334966 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-catalog-content\") pod \"certified-operators-df8gv\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.337618 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:25.837597232 +0000 UTC m=+216.858713123 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.373208 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.436433 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-catalog-content\") pod \"certified-operators-df8gv\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.436726 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.436783 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-utilities\") pod \"certified-operators-df8gv\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.436812 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwk57\" (UniqueName: \"kubernetes.io/projected/da03af74-8c59-4ccf-aff8-03dc6303e322-kube-api-access-fwk57\") pod \"certified-operators-df8gv\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.437556 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-catalog-content\") pod \"certified-operators-df8gv\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.437888 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:25.93787347 +0000 UTC m=+216.958989361 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.438336 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-utilities\") pod \"certified-operators-df8gv\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.468226 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwk57\" (UniqueName: \"kubernetes.io/projected/da03af74-8c59-4ccf-aff8-03dc6303e322-kube-api-access-fwk57\") pod \"certified-operators-df8gv\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.498832 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.509607 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-672cv"] Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.537977 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.538029 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kubelet-dir\") pod \"09cce78a-6bee-4201-82d7-a4e0dd041c9f\" (UID: \"09cce78a-6bee-4201-82d7-a4e0dd041c9f\") " Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.538097 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kube-api-access\") pod \"09cce78a-6bee-4201-82d7-a4e0dd041c9f\" (UID: \"09cce78a-6bee-4201-82d7-a4e0dd041c9f\") " Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.538317 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "09cce78a-6bee-4201-82d7-a4e0dd041c9f" (UID: "09cce78a-6bee-4201-82d7-a4e0dd041c9f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.538324 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.038307442 +0000 UTC m=+217.059423333 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.538428 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.538520 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.538817 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.038805186 +0000 UTC m=+217.059921077 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.542148 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "09cce78a-6bee-4201-82d7-a4e0dd041c9f" (UID: "09cce78a-6bee-4201-82d7-a4e0dd041c9f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.646600 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.647186 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.1470989 +0000 UTC m=+217.168214801 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.647299 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.647955 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.648769 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.148686344 +0000 UTC m=+217.169802235 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.680062 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-txx64"] Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.709810 4790 patch_prober.go:28] interesting pod/router-default-5444994796-pzx4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 20:31:25 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Mar 13 20:31:25 crc kubenswrapper[4790]: [+]process-running ok Mar 13 20:31:25 crc kubenswrapper[4790]: healthz check failed Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.709856 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pzx4q" podUID="658b4bb6-837c-48ed-b5f3-aa30bd1e9740" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.750952 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.751356 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.251331077 +0000 UTC m=+217.272446968 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.856157 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.856766 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.356750874 +0000 UTC m=+217.377866765 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.958730 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.959152 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.459132759 +0000 UTC m=+217.480248650 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.962479 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-df8gv"] Mar 13 20:31:25 crc kubenswrapper[4790]: W0313 20:31:25.977710 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda03af74_8c59_4ccf_aff8_03dc6303e322.slice/crio-9761002ea58d403e6092f58c313ddf3e3892646900d306f6d06f23ff553f5760 WatchSource:0}: Error finding container 9761002ea58d403e6092f58c313ddf3e3892646900d306f6d06f23ff553f5760: Status 404 returned error can't find the container with id 9761002ea58d403e6092f58c313ddf3e3892646900d306f6d06f23ff553f5760 Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.979130 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.979225 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"09cce78a-6bee-4201-82d7-a4e0dd041c9f","Type":"ContainerDied","Data":"78a1bbc4c5af6c1cce5a2bc6069daf0b91593a873ea629e68a80642ae7614598"} Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.979253 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78a1bbc4c5af6c1cce5a2bc6069daf0b91593a873ea629e68a80642ae7614598" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.984023 4790 generic.go:334] "Generic (PLEG): container finished" podID="869d7601-27fe-4a6a-840b-a9811c4d1e06" containerID="b3f64a80f53b3463abb2e75cb2ad8094df85b77279ffcd7d0508ada4f6f68f83" exitCode=0 Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.984091 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" event={"ID":"869d7601-27fe-4a6a-840b-a9811c4d1e06","Type":"ContainerDied","Data":"b3f64a80f53b3463abb2e75cb2ad8094df85b77279ffcd7d0508ada4f6f68f83"} Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.991827 4790 generic.go:334] "Generic (PLEG): container finished" podID="dbee8a79-e625-49ef-8fcb-944341ae6e37" containerID="721e8d71cffd6022d21d74d5c95c4b0f3755ad66a2257e8a4590088e187a7975" exitCode=0 Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.991902 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-672cv" event={"ID":"dbee8a79-e625-49ef-8fcb-944341ae6e37","Type":"ContainerDied","Data":"721e8d71cffd6022d21d74d5c95c4b0f3755ad66a2257e8a4590088e187a7975"} Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.991929 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-672cv" 
event={"ID":"dbee8a79-e625-49ef-8fcb-944341ae6e37","Type":"ContainerStarted","Data":"073e407a9eaa46913e8a833719c1712b0b191e45db2255328c3b799329f32f02"} Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.996949 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mnf26" event={"ID":"c54336a0-5a12-4bf9-9807-337dd352fdb6","Type":"ContainerStarted","Data":"a709e96657b309a9665690d02b5c7d5c73211093a04a9611a54e061de11d9da2"} Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.010682 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txx64" event={"ID":"7080e6b3-5934-4c2c-9361-23d20b5a495e","Type":"ContainerStarted","Data":"5bff08277bee799461658bd86530c13fa744a49d2daab25cbda9f9c23ac16aa2"} Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.018610 4790 generic.go:334] "Generic (PLEG): container finished" podID="1f45edb0-2914-47c2-82f3-a0f5a99fe9e9" containerID="b0fb5457a9676ea9d3a55511a014a0d139b4e8575ca4d1d1a0534aae99f0076d" exitCode=0 Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.018651 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" event={"ID":"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9","Type":"ContainerDied","Data":"b0fb5457a9676ea9d3a55511a014a0d139b4e8575ca4d1d1a0534aae99f0076d"} Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.034461 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mnf26" podStartSLOduration=171.03444391 podStartE2EDuration="2m51.03444391s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:26.032838647 +0000 UTC m=+217.053954538" watchObservedRunningTime="2026-03-13 20:31:26.03444391 +0000 UTC m=+217.055559801" Mar 13 20:31:26 
crc kubenswrapper[4790]: I0313 20:31:26.054231 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5tr4n"] Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.060159 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.061193 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.561179935 +0000 UTC m=+217.582295826 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.154150 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.162535 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.163218 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.66318994 +0000 UTC m=+217.684305831 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.191472 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.193981 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.193911 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.194231 4790 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.256016 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.264504 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/869d7601-27fe-4a6a-840b-a9811c4d1e06-serving-cert\") pod \"869d7601-27fe-4a6a-840b-a9811c4d1e06\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.264641 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-config\") pod \"869d7601-27fe-4a6a-840b-a9811c4d1e06\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.264667 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-proxy-ca-bundles\") pod \"869d7601-27fe-4a6a-840b-a9811c4d1e06\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.264843 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-client-ca\") pod \"869d7601-27fe-4a6a-840b-a9811c4d1e06\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.264863 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-fbpwx\" (UniqueName: \"kubernetes.io/projected/869d7601-27fe-4a6a-840b-a9811c4d1e06-kube-api-access-fbpwx\") pod \"869d7601-27fe-4a6a-840b-a9811c4d1e06\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.265050 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.268323 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.768308119 +0000 UTC m=+217.789424010 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.272898 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-client-ca" (OuterVolumeSpecName: "client-ca") pod "869d7601-27fe-4a6a-840b-a9811c4d1e06" (UID: "869d7601-27fe-4a6a-840b-a9811c4d1e06"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.273504 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-config" (OuterVolumeSpecName: "config") pod "869d7601-27fe-4a6a-840b-a9811c4d1e06" (UID: "869d7601-27fe-4a6a-840b-a9811c4d1e06"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.274031 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "869d7601-27fe-4a6a-840b-a9811c4d1e06" (UID: "869d7601-27fe-4a6a-840b-a9811c4d1e06"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.278841 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/869d7601-27fe-4a6a-840b-a9811c4d1e06-kube-api-access-fbpwx" (OuterVolumeSpecName: "kube-api-access-fbpwx") pod "869d7601-27fe-4a6a-840b-a9811c4d1e06" (UID: "869d7601-27fe-4a6a-840b-a9811c4d1e06"). InnerVolumeSpecName "kube-api-access-fbpwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.280038 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/869d7601-27fe-4a6a-840b-a9811c4d1e06-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "869d7601-27fe-4a6a-840b-a9811c4d1e06" (UID: "869d7601-27fe-4a6a-840b-a9811c4d1e06"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.295182 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.356632 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.357966 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.358040 4790 patch_prober.go:28] interesting pod/console-f9d7485db-q5j7f container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.358094 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-q5j7f" podUID="d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.366067 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.366176 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-config\") pod \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.366213 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-k6xvl\" (UniqueName: \"kubernetes.io/projected/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-kube-api-access-k6xvl\") pod \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.366244 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.866218913 +0000 UTC m=+217.887334834 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.366280 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-client-ca\") pod \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.366343 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-serving-cert\") pod \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.366581 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.366822 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.366842 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.366852 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbpwx\" (UniqueName: \"kubernetes.io/projected/869d7601-27fe-4a6a-840b-a9811c4d1e06-kube-api-access-fbpwx\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.366861 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.366870 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/869d7601-27fe-4a6a-840b-a9811c4d1e06-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.367844 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.867834687 +0000 UTC m=+217.888950578 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.368118 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-client-ca" (OuterVolumeSpecName: "client-ca") pod "1f45edb0-2914-47c2-82f3-a0f5a99fe9e9" (UID: "1f45edb0-2914-47c2-82f3-a0f5a99fe9e9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.368216 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-config" (OuterVolumeSpecName: "config") pod "1f45edb0-2914-47c2-82f3-a0f5a99fe9e9" (UID: "1f45edb0-2914-47c2-82f3-a0f5a99fe9e9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.371821 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-kube-api-access-k6xvl" (OuterVolumeSpecName: "kube-api-access-k6xvl") pod "1f45edb0-2914-47c2-82f3-a0f5a99fe9e9" (UID: "1f45edb0-2914-47c2-82f3-a0f5a99fe9e9"). InnerVolumeSpecName "kube-api-access-k6xvl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.371940 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1f45edb0-2914-47c2-82f3-a0f5a99fe9e9" (UID: "1f45edb0-2914-47c2-82f3-a0f5a99fe9e9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.375663 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.376530 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.386536 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.468006 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.468306 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.96828738 +0000 UTC m=+217.989403271 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.468483 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.468507 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6xvl\" (UniqueName: \"kubernetes.io/projected/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-kube-api-access-k6xvl\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.468519 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.468531 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.569647 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:26 crc kubenswrapper[4790]: 
E0313 20:31:26.570010 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:27.069991847 +0000 UTC m=+218.091107748 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.675199 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.675572 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:27.175545308 +0000 UTC m=+218.196661229 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.701357 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.707634 4790 patch_prober.go:28] interesting pod/router-default-5444994796-pzx4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 20:31:26 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Mar 13 20:31:26 crc kubenswrapper[4790]: [+]process-running ok Mar 13 20:31:26 crc kubenswrapper[4790]: healthz check failed Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.707698 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pzx4q" podUID="658b4bb6-837c-48ed-b5f3-aa30bd1e9740" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.730840 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bq4pj"] Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.731052 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09cce78a-6bee-4201-82d7-a4e0dd041c9f" containerName="pruner" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.731065 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="09cce78a-6bee-4201-82d7-a4e0dd041c9f" 
containerName="pruner" Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.731081 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f45edb0-2914-47c2-82f3-a0f5a99fe9e9" containerName="route-controller-manager" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.731088 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f45edb0-2914-47c2-82f3-a0f5a99fe9e9" containerName="route-controller-manager" Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.731102 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="869d7601-27fe-4a6a-840b-a9811c4d1e06" containerName="controller-manager" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.731110 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="869d7601-27fe-4a6a-840b-a9811c4d1e06" containerName="controller-manager" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.731191 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="09cce78a-6bee-4201-82d7-a4e0dd041c9f" containerName="pruner" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.731204 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f45edb0-2914-47c2-82f3-a0f5a99fe9e9" containerName="route-controller-manager" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.731213 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="869d7601-27fe-4a6a-840b-a9811c4d1e06" containerName="controller-manager" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.731917 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.734040 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.745048 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bq4pj"] Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.776895 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.778054 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:27.278041696 +0000 UTC m=+218.299157577 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.877935 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.878292 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:27.378256833 +0000 UTC m=+218.399372724 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.878505 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbqp8\" (UniqueName: \"kubernetes.io/projected/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-kube-api-access-xbqp8\") pod \"redhat-marketplace-bq4pj\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.878534 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-catalog-content\") pod \"redhat-marketplace-bq4pj\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.878578 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.878635 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-utilities\") pod \"redhat-marketplace-bq4pj\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.878909 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:27.37889799 +0000 UTC m=+218.400013871 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.979329 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.979477 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-utilities\") pod \"redhat-marketplace-bq4pj\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.979507 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbqp8\" (UniqueName: 
\"kubernetes.io/projected/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-kube-api-access-xbqp8\") pod \"redhat-marketplace-bq4pj\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.979535 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:27.479509647 +0000 UTC m=+218.500625538 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.979582 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-catalog-content\") pod \"redhat-marketplace-bq4pj\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.979717 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.980200 4790 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:27.480184555 +0000 UTC m=+218.501300446 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.980247 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-utilities\") pod \"redhat-marketplace-bq4pj\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.980288 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-catalog-content\") pod \"redhat-marketplace-bq4pj\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.016255 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbqp8\" (UniqueName: \"kubernetes.io/projected/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-kube-api-access-xbqp8\") pod \"redhat-marketplace-bq4pj\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.034659 4790 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jw27w" event={"ID":"9e6c6344-8059-43d7-97be-273d115b8471","Type":"ContainerStarted","Data":"aa535420608f709689b7d99354c2ca7e3de7feac56c2bea20ab5aa4a2cb8cb0d"} Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.041208 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" event={"ID":"869d7601-27fe-4a6a-840b-a9811c4d1e06","Type":"ContainerDied","Data":"261ca998108ed493dc900955a8fd9a4c77b099c17c3446f5d7d42417ca41db4e"} Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.041276 4790 scope.go:117] "RemoveContainer" containerID="b3f64a80f53b3463abb2e75cb2ad8094df85b77279ffcd7d0508ada4f6f68f83" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.041346 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.045821 4790 generic.go:334] "Generic (PLEG): container finished" podID="7080e6b3-5934-4c2c-9361-23d20b5a495e" containerID="37f1fa4e4095d22491db4f81d70f3406da6fafc539527a21b3dba5846164e566" exitCode=0 Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.045894 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txx64" event={"ID":"7080e6b3-5934-4c2c-9361-23d20b5a495e","Type":"ContainerDied","Data":"37f1fa4e4095d22491db4f81d70f3406da6fafc539527a21b3dba5846164e566"} Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.048658 4790 generic.go:334] "Generic (PLEG): container finished" podID="446f0f4c-a97c-47d0-929d-0b99e07c8186" containerID="33326be198fd78688d8c0e82df3982727cfbc7e94ef4969d1503af495b1859ed" exitCode=0 Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.048802 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tr4n" 
event={"ID":"446f0f4c-a97c-47d0-929d-0b99e07c8186","Type":"ContainerDied","Data":"33326be198fd78688d8c0e82df3982727cfbc7e94ef4969d1503af495b1859ed"} Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.048868 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tr4n" event={"ID":"446f0f4c-a97c-47d0-929d-0b99e07c8186","Type":"ContainerStarted","Data":"1f3bbc4d7d37e2d400e1366f116e79095d38ddf23a471dd30cc3d7e41c04740d"} Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.059139 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" event={"ID":"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9","Type":"ContainerDied","Data":"99cf4ef26fb9eb5a3a40ad496b60c26b191859906bd206806ca175b1e727b6b2"} Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.059179 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.066361 4790 generic.go:334] "Generic (PLEG): container finished" podID="da03af74-8c59-4ccf-aff8-03dc6303e322" containerID="18d45729b57b0625b6ac059bc91aedd72d39045472cf08d5152f47c470f71f43" exitCode=0 Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.066731 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df8gv" event={"ID":"da03af74-8c59-4ccf-aff8-03dc6303e322","Type":"ContainerDied","Data":"18d45729b57b0625b6ac059bc91aedd72d39045472cf08d5152f47c470f71f43"} Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.066762 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df8gv" event={"ID":"da03af74-8c59-4ccf-aff8-03dc6303e322","Type":"ContainerStarted","Data":"9761002ea58d403e6092f58c313ddf3e3892646900d306f6d06f23ff553f5760"} Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 
20:31:27.078955 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.080220 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.087565 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:27 crc kubenswrapper[4790]: E0313 20:31:27.088665 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:27.588636155 +0000 UTC m=+218.609752066 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.103773 4790 scope.go:117] "RemoveContainer" containerID="b0fb5457a9676ea9d3a55511a014a0d139b4e8575ca4d1d1a0534aae99f0076d" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.109458 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zsqd7"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.124498 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zsqd7"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.127770 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mf4tm"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.131037 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.160592 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mf4tm"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.201957 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:27 crc kubenswrapper[4790]: E0313 20:31:27.203370 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:27.703357765 +0000 UTC m=+218.724473646 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.209718 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.246502 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.253695 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.254365 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.257997 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8687f458cd-h5svs"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.259068 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.259996 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.260100 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.260488 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.260514 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.260963 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.261085 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.287512 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.288016 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.288573 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.288681 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 20:31:27 crc 
kubenswrapper[4790]: I0313 20:31:27.288896 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.289285 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.289550 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.306776 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.307100 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-utilities\") pod \"redhat-marketplace-mf4tm\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.307139 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5fzv\" (UniqueName: \"kubernetes.io/projected/f1be7d98-ff3a-42bb-b8ff-4001814ae453-kube-api-access-x5fzv\") pod \"redhat-marketplace-mf4tm\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.307285 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-catalog-content\") pod \"redhat-marketplace-mf4tm\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:31:27 crc kubenswrapper[4790]: E0313 20:31:27.307457 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:27.807437776 +0000 UTC m=+218.828553667 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.315064 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.315439 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8687f458cd-h5svs"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410550 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbn26\" (UniqueName: \"kubernetes.io/projected/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-kube-api-access-cbn26\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410601 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-client-ca\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410631 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5fzv\" (UniqueName: \"kubernetes.io/projected/f1be7d98-ff3a-42bb-b8ff-4001814ae453-kube-api-access-x5fzv\") pod \"redhat-marketplace-mf4tm\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410670 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410701 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-config\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410746 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-serving-cert\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: 
\"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410771 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-client-ca\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410799 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-config\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410827 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-proxy-ca-bundles\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410847 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-797h5\" (UniqueName: \"kubernetes.io/projected/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-kube-api-access-797h5\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410877 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-serving-cert\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410908 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-catalog-content\") pod \"redhat-marketplace-mf4tm\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410938 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-utilities\") pod \"redhat-marketplace-mf4tm\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.412069 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-utilities\") pod \"redhat-marketplace-mf4tm\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:31:27 crc kubenswrapper[4790]: E0313 20:31:27.412573 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:27.912562876 +0000 UTC m=+218.933678767 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.412932 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-catalog-content\") pod \"redhat-marketplace-mf4tm\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.449146 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5fzv\" (UniqueName: \"kubernetes.io/projected/f1be7d98-ff3a-42bb-b8ff-4001814ae453-kube-api-access-x5fzv\") pod \"redhat-marketplace-mf4tm\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.482007 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.493963 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.495201 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.498681 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.498904 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.512907 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.512994 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1ce5c74b-2f06-4910-92b5-54abaa46ab8b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.513021 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-serving-cert\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.513048 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbn26\" (UniqueName: \"kubernetes.io/projected/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-kube-api-access-cbn26\") pod \"controller-manager-8687f458cd-h5svs\" (UID: 
\"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.513066 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-client-ca\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.513098 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-config\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.513125 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-serving-cert\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.513141 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-client-ca\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.513156 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1ce5c74b-2f06-4910-92b5-54abaa46ab8b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.513175 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-config\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.513195 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-proxy-ca-bundles\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.513209 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-797h5\" (UniqueName: \"kubernetes.io/projected/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-kube-api-access-797h5\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: E0313 20:31:27.513470 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:28.013454971 +0000 UTC m=+219.034570852 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.516186 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-client-ca\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.517553 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-serving-cert\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.518607 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-config\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.519556 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-proxy-ca-bundles\") pod \"controller-manager-8687f458cd-h5svs\" (UID: 
\"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.519805 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-config\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.522524 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.528185 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-serving-cert\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.553258 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbn26\" (UniqueName: \"kubernetes.io/projected/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-kube-api-access-cbn26\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.555246 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-client-ca\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 
20:31:27.555927 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34086: no serving certificate available for the kubelet" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.561242 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-797h5\" (UniqueName: \"kubernetes.io/projected/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-kube-api-access-797h5\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.601776 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34100: no serving certificate available for the kubelet" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.617952 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1ce5c74b-2f06-4910-92b5-54abaa46ab8b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.618099 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1ce5c74b-2f06-4910-92b5-54abaa46ab8b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.618233 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 
20:31:27 crc kubenswrapper[4790]: E0313 20:31:27.619067 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:28.119049542 +0000 UTC m=+219.140165443 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.619503 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1ce5c74b-2f06-4910-92b5-54abaa46ab8b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.660936 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1ce5c74b-2f06-4910-92b5-54abaa46ab8b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.673174 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f45edb0-2914-47c2-82f3-a0f5a99fe9e9" path="/var/lib/kubelet/pods/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9/volumes" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.674215 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="869d7601-27fe-4a6a-840b-a9811c4d1e06" 
path="/var/lib/kubelet/pods/869d7601-27fe-4a6a-840b-a9811c4d1e06/volumes" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.685370 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.708614 4790 patch_prober.go:28] interesting pod/router-default-5444994796-pzx4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 20:31:27 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Mar 13 20:31:27 crc kubenswrapper[4790]: [+]process-running ok Mar 13 20:31:27 crc kubenswrapper[4790]: healthz check failed Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.708666 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pzx4q" podUID="658b4bb6-837c-48ed-b5f3-aa30bd1e9740" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.719200 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:27 crc kubenswrapper[4790]: E0313 20:31:27.719622 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:28.219602538 +0000 UTC m=+219.240718429 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.722467 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bq4pj"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.737434 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hnd2l"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.740929 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.742846 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.747641 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.757305 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hnd2l"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.821537 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-catalog-content\") pod \"redhat-operators-hnd2l\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.821617 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.821648 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-utilities\") pod \"redhat-operators-hnd2l\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.821707 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vdxw\" (UniqueName: \"kubernetes.io/projected/36d32cb2-55c9-48cc-9376-66231ae66f8a-kube-api-access-7vdxw\") pod 
\"redhat-operators-hnd2l\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:31:27 crc kubenswrapper[4790]: E0313 20:31:27.822106 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:28.322091616 +0000 UTC m=+219.343207507 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.837516 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.922679 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.923177 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-utilities\") pod \"redhat-operators-hnd2l\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " pod="openshift-marketplace/redhat-operators-hnd2l"
Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.923270 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vdxw\" (UniqueName: \"kubernetes.io/projected/36d32cb2-55c9-48cc-9376-66231ae66f8a-kube-api-access-7vdxw\") pod \"redhat-operators-hnd2l\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " pod="openshift-marketplace/redhat-operators-hnd2l"
Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.923296 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-catalog-content\") pod \"redhat-operators-hnd2l\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " pod="openshift-marketplace/redhat-operators-hnd2l"
Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.923715 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-catalog-content\") pod \"redhat-operators-hnd2l\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " pod="openshift-marketplace/redhat-operators-hnd2l"
Mar 13 20:31:27 crc kubenswrapper[4790]: E0313 20:31:27.924073 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:28.42405571 +0000 UTC m=+219.445171611 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.924362 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-utilities\") pod \"redhat-operators-hnd2l\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " pod="openshift-marketplace/redhat-operators-hnd2l"
Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.952077 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vdxw\" (UniqueName: \"kubernetes.io/projected/36d32cb2-55c9-48cc-9376-66231ae66f8a-kube-api-access-7vdxw\") pod \"redhat-operators-hnd2l\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " pod="openshift-marketplace/redhat-operators-hnd2l"
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.026120 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm"
Mar 13 20:31:28 crc kubenswrapper[4790]: E0313 20:31:28.026490 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:28.526476106 +0000 UTC m=+219.547591997 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.053055 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg"]
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.054790 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mf4tm"]
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.074547 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hnd2l"
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.082217 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jw27w" event={"ID":"9e6c6344-8059-43d7-97be-273d115b8471","Type":"ContainerStarted","Data":"e381ee8bd664d5619b0e6d2a4c827fec9aa9a14fcff37d85a821534359f1ff27"}
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.093564 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bq4pj" event={"ID":"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c","Type":"ContainerStarted","Data":"48ce1cd0515d2f72905d7c3b45c89c2baec4ecf2f36741a13ea570b7bf830ee2"}
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.123277 4790 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.130925 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 20:31:28 crc kubenswrapper[4790]: E0313 20:31:28.132214 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:28.632188752 +0000 UTC m=+219.653304683 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.134497 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fxjp7"]
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.135787 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fxjp7"
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.147410 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fxjp7"]
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.232202 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhj8f\" (UniqueName: \"kubernetes.io/projected/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-kube-api-access-vhj8f\") pod \"redhat-operators-fxjp7\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " pod="openshift-marketplace/redhat-operators-fxjp7"
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.232252 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm"
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.232295 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-utilities\") pod \"redhat-operators-fxjp7\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " pod="openshift-marketplace/redhat-operators-fxjp7"
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.232316 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-catalog-content\") pod \"redhat-operators-fxjp7\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " pod="openshift-marketplace/redhat-operators-fxjp7"
Mar 13 20:31:28 crc kubenswrapper[4790]: E0313 20:31:28.232658 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:28.732645174 +0000 UTC m=+219.753761135 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.243933 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8687f458cd-h5svs"]
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.333969 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.334544 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-utilities\") pod \"redhat-operators-fxjp7\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " pod="openshift-marketplace/redhat-operators-fxjp7"
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.334573 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-catalog-content\") pod \"redhat-operators-fxjp7\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " pod="openshift-marketplace/redhat-operators-fxjp7"
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.334651 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhj8f\" (UniqueName: \"kubernetes.io/projected/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-kube-api-access-vhj8f\") pod \"redhat-operators-fxjp7\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " pod="openshift-marketplace/redhat-operators-fxjp7"
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.335339 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-utilities\") pod \"redhat-operators-fxjp7\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " pod="openshift-marketplace/redhat-operators-fxjp7"
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.335635 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-catalog-content\") pod \"redhat-operators-fxjp7\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " pod="openshift-marketplace/redhat-operators-fxjp7"
Mar 13 20:31:28 crc kubenswrapper[4790]: E0313 20:31:28.336196 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:28.83617575 +0000 UTC m=+219.857291661 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.394591 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhj8f\" (UniqueName: \"kubernetes.io/projected/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-kube-api-access-vhj8f\") pod \"redhat-operators-fxjp7\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " pod="openshift-marketplace/redhat-operators-fxjp7"
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.436014 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm"
Mar 13 20:31:28 crc kubenswrapper[4790]: E0313 20:31:28.436323 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:28.936311775 +0000 UTC m=+219.957427666 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.509761 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.541194 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 20:31:28 crc kubenswrapper[4790]: E0313 20:31:28.541561 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:29.041542518 +0000 UTC m=+220.062658409 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.557680 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fxjp7"
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.643494 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm"
Mar 13 20:31:28 crc kubenswrapper[4790]: E0313 20:31:28.643888 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:29.143872671 +0000 UTC m=+220.164988562 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.663880 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hnd2l"]
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.711011 4790 patch_prober.go:28] interesting pod/router-default-5444994796-pzx4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 20:31:28 crc kubenswrapper[4790]: [+]has-synced ok
Mar 13 20:31:28 crc kubenswrapper[4790]: [+]process-running ok
Mar 13 20:31:28 crc kubenswrapper[4790]: healthz check failed
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.711074 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pzx4q" podUID="658b4bb6-837c-48ed-b5f3-aa30bd1e9740" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.746242 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 20:31:28 crc kubenswrapper[4790]: E0313 20:31:28.747152 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:29.2471177 +0000 UTC m=+220.268233601 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.849422 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm"
Mar 13 20:31:28 crc kubenswrapper[4790]: E0313 20:31:28.850473 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:29.350453141 +0000 UTC m=+220.371569032 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.904808 4790 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-13T20:31:28.12331277Z","Handler":null,"Name":""}
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.905926 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fxjp7"]
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.907941 4790 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.907977 4790 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.950659 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 13 20:31:28 crc kubenswrapper[4790]: W0313 20:31:28.990093 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aa0c26b_aef8_49e9_9904_da9e8d029c9d.slice/crio-050e353cf4b2b386c77190a755e24b1d103134a927f84598ad4dbf53d6d3a4fa WatchSource:0}: Error finding container 050e353cf4b2b386c77190a755e24b1d103134a927f84598ad4dbf53d6d3a4fa: Status 404 returned error can't find the container with id 050e353cf4b2b386c77190a755e24b1d103134a927f84598ad4dbf53d6d3a4fa
Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.993055 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.052309 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm"
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.055218 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.055265 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm"
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.137232 4790 generic.go:334] "Generic (PLEG): container finished" podID="87e4f09f-d19e-4b0a-85b2-636b5ce5ef51" containerID="93c1f10337c2883de8c80150a75f7613328eeffafc6c4c7570ee71639cf9048a" exitCode=0
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.137318 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" event={"ID":"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51","Type":"ContainerDied","Data":"93c1f10337c2883de8c80150a75f7613328eeffafc6c4c7570ee71639cf9048a"}
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.139112 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxjp7" event={"ID":"4aa0c26b-aef8-49e9-9904-da9e8d029c9d","Type":"ContainerStarted","Data":"050e353cf4b2b386c77190a755e24b1d103134a927f84598ad4dbf53d6d3a4fa"}
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.143262 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jw27w" event={"ID":"9e6c6344-8059-43d7-97be-273d115b8471","Type":"ContainerStarted","Data":"eaace0a3c5e5a3b8c9c6cbe3e7cb57115efee8da33f517b47b08d24616047a6a"}
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.144816 4790 generic.go:334] "Generic (PLEG): container finished" podID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" containerID="81d571ea6f444235cc217ca2f76bd3ade803e952dcea7fa197b363c62b207fc9" exitCode=0
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.144883 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bq4pj" event={"ID":"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c","Type":"ContainerDied","Data":"81d571ea6f444235cc217ca2f76bd3ade803e952dcea7fa197b363c62b207fc9"}
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.149548 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm"
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.151840 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" event={"ID":"04038bbe-4cc0-4d19-80d7-f86cdffda1d5","Type":"ContainerStarted","Data":"c582f7273eb48a0199c8d7ed2bdfea28605189fac5c66a90356664d5f29d8618"}
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.151890 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" event={"ID":"04038bbe-4cc0-4d19-80d7-f86cdffda1d5","Type":"ContainerStarted","Data":"db98d9a2f6fdbbe8abd9aaa1bfbfc3ef07a5ef170be435e5a3e26c5a2b07958a"}
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.168117 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs"
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.169556 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" event={"ID":"ff385bac-0b93-4dc8-b8bc-ef1b4986649b","Type":"ContainerStarted","Data":"fe3a1b339d4ab84389f1307367b2bebb3168e6d421f121726f4d936f123537ed"}
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.169598 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" event={"ID":"ff385bac-0b93-4dc8-b8bc-ef1b4986649b","Type":"ContainerStarted","Data":"e64961300406621b1586951d2f5b3f6e49675a9edac316b75dde99552bd4b189"}
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.172548 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg"
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.173729 4790 generic.go:334] "Generic (PLEG): container finished" podID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" containerID="afed47472efd96d5fb96f1be65a82143aad59afc7569141f603e4362a1d44b0e" exitCode=0
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.173784 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf4tm" event={"ID":"f1be7d98-ff3a-42bb-b8ff-4001814ae453","Type":"ContainerDied","Data":"afed47472efd96d5fb96f1be65a82143aad59afc7569141f603e4362a1d44b0e"}
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.173803 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf4tm" event={"ID":"f1be7d98-ff3a-42bb-b8ff-4001814ae453","Type":"ContainerStarted","Data":"53cd4a75ebfee1686f2db1e566581c31a9b03470b4313025f3a980087eb27a00"}
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.175969 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1ce5c74b-2f06-4910-92b5-54abaa46ab8b","Type":"ContainerStarted","Data":"6aa52f928dfcf20043e1e8907fdf7b8ac7fc0fee297941db822b3adf39ff7e52"}
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.177585 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hnd2l" event={"ID":"36d32cb2-55c9-48cc-9376-66231ae66f8a","Type":"ContainerStarted","Data":"5433561752fd3b8f83751ddd33926ccfe479acc64fdf830adcad528290d813de"}
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.209814 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" podStartSLOduration=4.209797631 podStartE2EDuration="4.209797631s" podCreationTimestamp="2026-03-13 20:31:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:29.208556927 +0000 UTC m=+220.229672818" watchObservedRunningTime="2026-03-13 20:31:29.209797631 +0000 UTC m=+220.230913522"
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.252207 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs"
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.291523 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-jw27w" podStartSLOduration=16.291502775 podStartE2EDuration="16.291502775s" podCreationTimestamp="2026-03-13 20:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:29.286803008 +0000 UTC m=+220.307918899" watchObservedRunningTime="2026-03-13 20:31:29.291502775 +0000 UTC m=+220.312618666"
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.342550 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" podStartSLOduration=4.342532019 podStartE2EDuration="4.342532019s" podCreationTimestamp="2026-03-13 20:31:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:29.341625404 +0000 UTC m=+220.362741295" watchObservedRunningTime="2026-03-13 20:31:29.342532019 +0000 UTC m=+220.363647910"
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.456176 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg"
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.459707 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm"
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.722451 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.749245 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-pzx4q"
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.758169 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-pzx4q"
Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.839096 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqdfm"]
Mar 13 20:31:29 crc kubenswrapper[4790]: W0313 20:31:29.841991 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81949470_5c0d_4294_8618_d6ee14da1d41.slice/crio-403adc19adb7ec63c9d90ee6fa3c1500a5901074edab6bc1faa1e7eed14336b6 WatchSource:0}: Error finding container 403adc19adb7ec63c9d90ee6fa3c1500a5901074edab6bc1faa1e7eed14336b6: Status 404 returned error can't find the container with id 403adc19adb7ec63c9d90ee6fa3c1500a5901074edab6bc1faa1e7eed14336b6
Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.185506 4790 generic.go:334] "Generic (PLEG): container finished" podID="1ce5c74b-2f06-4910-92b5-54abaa46ab8b" containerID="baf477fae6f589d57e0060deeedd270c70f1976e7a6063c5a8ff425709bdf2d2" exitCode=0
Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.185590 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1ce5c74b-2f06-4910-92b5-54abaa46ab8b","Type":"ContainerDied","Data":"baf477fae6f589d57e0060deeedd270c70f1976e7a6063c5a8ff425709bdf2d2"}
Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.186412 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" event={"ID":"81949470-5c0d-4294-8618-d6ee14da1d41","Type":"ContainerStarted","Data":"403adc19adb7ec63c9d90ee6fa3c1500a5901074edab6bc1faa1e7eed14336b6"}
Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.187712 4790 generic.go:334] "Generic (PLEG): container finished" podID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerID="f276b163ccc0d21403b49d02b3c506a94213d0bcc943d5fcede3603bc020ebfc" exitCode=0
Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.187984 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxjp7" event={"ID":"4aa0c26b-aef8-49e9-9904-da9e8d029c9d","Type":"ContainerDied","Data":"f276b163ccc0d21403b49d02b3c506a94213d0bcc943d5fcede3603bc020ebfc"}
Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.191872 4790 generic.go:334] "Generic (PLEG): container finished" podID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerID="4ccfbd25425ce912c32c0f73aa49b376929e5a036b5718d87d565520eab1f4ab" exitCode=0
Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.192010 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hnd2l" event={"ID":"36d32cb2-55c9-48cc-9376-66231ae66f8a","Type":"ContainerDied","Data":"4ccfbd25425ce912c32c0f73aa49b376929e5a036b5718d87d565520eab1f4ab"}
Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.506091 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn"
Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.693920 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d66s7\" (UniqueName: \"kubernetes.io/projected/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-kube-api-access-d66s7\") pod \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") "
Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.694028 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-secret-volume\") pod \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") "
Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.694122 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-config-volume\") pod \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") "
Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.694911 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-config-volume" (OuterVolumeSpecName: "config-volume") pod "87e4f09f-d19e-4b0a-85b2-636b5ce5ef51" (UID: "87e4f09f-d19e-4b0a-85b2-636b5ce5ef51"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.705166 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-kube-api-access-d66s7" (OuterVolumeSpecName: "kube-api-access-d66s7") pod "87e4f09f-d19e-4b0a-85b2-636b5ce5ef51" (UID: "87e4f09f-d19e-4b0a-85b2-636b5ce5ef51"). InnerVolumeSpecName "kube-api-access-d66s7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.705216 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "87e4f09f-d19e-4b0a-85b2-636b5ce5ef51" (UID: "87e4f09f-d19e-4b0a-85b2-636b5ce5ef51"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.795426 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.795460 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-config-volume\") on node \"crc\" DevicePath \"\""
Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.795470 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d66s7\" (UniqueName: \"kubernetes.io/projected/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-kube-api-access-d66s7\") on node \"crc\" DevicePath \"\""
Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.206856 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn"
event={"ID":"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51","Type":"ContainerDied","Data":"1809f43b88080170a440a364505c4febd360a062e9e4aabd772262f808d67b1c"} Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.206897 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.206900 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1809f43b88080170a440a364505c4febd360a062e9e4aabd772262f808d67b1c" Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.210547 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" event={"ID":"81949470-5c0d-4294-8618-d6ee14da1d41","Type":"ContainerStarted","Data":"1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb"} Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.210683 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.253793 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" podStartSLOduration=176.253776124 podStartE2EDuration="2m56.253776124s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:31.251026919 +0000 UTC m=+222.272142810" watchObservedRunningTime="2026-03-13 20:31:31.253776124 +0000 UTC m=+222.274892015" Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.585088 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.586596 4790 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.711222 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kube-api-access\") pod \"1ce5c74b-2f06-4910-92b5-54abaa46ab8b\" (UID: \"1ce5c74b-2f06-4910-92b5-54abaa46ab8b\") " Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.711304 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kubelet-dir\") pod \"1ce5c74b-2f06-4910-92b5-54abaa46ab8b\" (UID: \"1ce5c74b-2f06-4910-92b5-54abaa46ab8b\") " Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.712146 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1ce5c74b-2f06-4910-92b5-54abaa46ab8b" (UID: "1ce5c74b-2f06-4910-92b5-54abaa46ab8b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.724427 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1ce5c74b-2f06-4910-92b5-54abaa46ab8b" (UID: "1ce5c74b-2f06-4910-92b5-54abaa46ab8b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.812751 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.812788 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:32 crc kubenswrapper[4790]: I0313 20:31:32.218416 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1ce5c74b-2f06-4910-92b5-54abaa46ab8b","Type":"ContainerDied","Data":"6aa52f928dfcf20043e1e8907fdf7b8ac7fc0fee297941db822b3adf39ff7e52"} Mar 13 20:31:32 crc kubenswrapper[4790]: I0313 20:31:32.218467 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6aa52f928dfcf20043e1e8907fdf7b8ac7fc0fee297941db822b3adf39ff7e52" Mar 13 20:31:32 crc kubenswrapper[4790]: I0313 20:31:32.218441 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 20:31:32 crc kubenswrapper[4790]: I0313 20:31:32.709589 4790 ???:1] "http: TLS handshake error from 192.168.126.11:36548: no serving certificate available for the kubelet" Mar 13 20:31:36 crc kubenswrapper[4790]: I0313 20:31:36.191406 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:31:36 crc kubenswrapper[4790]: I0313 20:31:36.191425 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:31:36 crc kubenswrapper[4790]: I0313 20:31:36.191465 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:31:36 crc kubenswrapper[4790]: I0313 20:31:36.191470 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:31:36 crc kubenswrapper[4790]: I0313 20:31:36.363251 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:36 crc kubenswrapper[4790]: I0313 20:31:36.374887 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:44 crc kubenswrapper[4790]: I0313 20:31:44.017565 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:31:44 crc kubenswrapper[4790]: I0313 20:31:44.017921 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:31:44 crc kubenswrapper[4790]: I0313 20:31:44.791064 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8687f458cd-h5svs"] Mar 13 20:31:44 crc kubenswrapper[4790]: I0313 20:31:44.791304 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" podUID="04038bbe-4cc0-4d19-80d7-f86cdffda1d5" containerName="controller-manager" containerID="cri-o://c582f7273eb48a0199c8d7ed2bdfea28605189fac5c66a90356664d5f29d8618" gracePeriod=30 Mar 13 20:31:44 crc kubenswrapper[4790]: I0313 20:31:44.810026 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg"] Mar 13 20:31:44 crc kubenswrapper[4790]: I0313 20:31:44.810276 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" podUID="ff385bac-0b93-4dc8-b8bc-ef1b4986649b" containerName="route-controller-manager" containerID="cri-o://fe3a1b339d4ab84389f1307367b2bebb3168e6d421f121726f4d936f123537ed" gracePeriod=30 Mar 13 
20:31:45 crc kubenswrapper[4790]: I0313 20:31:45.712492 4790 generic.go:334] "Generic (PLEG): container finished" podID="04038bbe-4cc0-4d19-80d7-f86cdffda1d5" containerID="c582f7273eb48a0199c8d7ed2bdfea28605189fac5c66a90356664d5f29d8618" exitCode=0 Mar 13 20:31:45 crc kubenswrapper[4790]: I0313 20:31:45.712595 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" event={"ID":"04038bbe-4cc0-4d19-80d7-f86cdffda1d5","Type":"ContainerDied","Data":"c582f7273eb48a0199c8d7ed2bdfea28605189fac5c66a90356664d5f29d8618"} Mar 13 20:31:45 crc kubenswrapper[4790]: I0313 20:31:45.714553 4790 generic.go:334] "Generic (PLEG): container finished" podID="ff385bac-0b93-4dc8-b8bc-ef1b4986649b" containerID="fe3a1b339d4ab84389f1307367b2bebb3168e6d421f121726f4d936f123537ed" exitCode=0 Mar 13 20:31:45 crc kubenswrapper[4790]: I0313 20:31:45.714605 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" event={"ID":"ff385bac-0b93-4dc8-b8bc-ef1b4986649b","Type":"ContainerDied","Data":"fe3a1b339d4ab84389f1307367b2bebb3168e6d421f121726f4d936f123537ed"} Mar 13 20:31:46 crc kubenswrapper[4790]: I0313 20:31:46.191634 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:31:46 crc kubenswrapper[4790]: I0313 20:31:46.191715 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:31:46 crc kubenswrapper[4790]: I0313 20:31:46.191765 4790 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-zfhhl" Mar 13 20:31:46 crc kubenswrapper[4790]: I0313 20:31:46.191651 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:31:46 crc kubenswrapper[4790]: I0313 20:31:46.191820 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:31:46 crc kubenswrapper[4790]: I0313 20:31:46.192351 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:31:46 crc kubenswrapper[4790]: I0313 20:31:46.192510 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:31:46 crc kubenswrapper[4790]: I0313 20:31:46.193530 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"e9bbd363611d3d25a3b6940c0c5a363cbf07f241be6299b10534167899b2bdac"} pod="openshift-console/downloads-7954f5f757-zfhhl" containerMessage="Container download-server failed liveness probe, will be restarted" Mar 13 20:31:46 crc kubenswrapper[4790]: I0313 20:31:46.193582 4790 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" containerID="cri-o://e9bbd363611d3d25a3b6940c0c5a363cbf07f241be6299b10534167899b2bdac" gracePeriod=2 Mar 13 20:31:47 crc kubenswrapper[4790]: I0313 20:31:47.686681 4790 patch_prober.go:28] interesting pod/route-controller-manager-69fc968766-v5gfg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Mar 13 20:31:47 crc kubenswrapper[4790]: I0313 20:31:47.687013 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" podUID="ff385bac-0b93-4dc8-b8bc-ef1b4986649b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Mar 13 20:31:47 crc kubenswrapper[4790]: I0313 20:31:47.727732 4790 generic.go:334] "Generic (PLEG): container finished" podID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerID="e9bbd363611d3d25a3b6940c0c5a363cbf07f241be6299b10534167899b2bdac" exitCode=0 Mar 13 20:31:47 crc kubenswrapper[4790]: I0313 20:31:47.727782 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zfhhl" event={"ID":"6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150","Type":"ContainerDied","Data":"e9bbd363611d3d25a3b6940c0c5a363cbf07f241be6299b10534167899b2bdac"} Mar 13 20:31:47 crc kubenswrapper[4790]: I0313 20:31:47.742534 4790 patch_prober.go:28] interesting pod/controller-manager-8687f458cd-h5svs container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body= Mar 13 
20:31:47 crc kubenswrapper[4790]: I0313 20:31:47.742603 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" podUID="04038bbe-4cc0-4d19-80d7-f86cdffda1d5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Mar 13 20:31:49 crc kubenswrapper[4790]: I0313 20:31:49.467369 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:53 crc kubenswrapper[4790]: I0313 20:31:53.220833 4790 ???:1] "http: TLS handshake error from 192.168.126.11:49802: no serving certificate available for the kubelet" Mar 13 20:31:56 crc kubenswrapper[4790]: I0313 20:31:56.190994 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:31:56 crc kubenswrapper[4790]: I0313 20:31:56.191331 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:31:56 crc kubenswrapper[4790]: I0313 20:31:56.390807 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" Mar 13 20:31:57 crc kubenswrapper[4790]: I0313 20:31:57.687288 4790 patch_prober.go:28] interesting pod/route-controller-manager-69fc968766-v5gfg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": 
dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Mar 13 20:31:57 crc kubenswrapper[4790]: I0313 20:31:57.687640 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" podUID="ff385bac-0b93-4dc8-b8bc-ef1b4986649b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Mar 13 20:31:57 crc kubenswrapper[4790]: I0313 20:31:57.742210 4790 patch_prober.go:28] interesting pod/controller-manager-8687f458cd-h5svs container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body= Mar 13 20:31:57 crc kubenswrapper[4790]: I0313 20:31:57.742265 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" podUID="04038bbe-4cc0-4d19-80d7-f86cdffda1d5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.471060 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 20:31:58 crc kubenswrapper[4790]: E0313 20:31:58.471283 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ce5c74b-2f06-4910-92b5-54abaa46ab8b" containerName="pruner" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.471294 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce5c74b-2f06-4910-92b5-54abaa46ab8b" containerName="pruner" Mar 13 20:31:58 crc kubenswrapper[4790]: E0313 20:31:58.471305 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87e4f09f-d19e-4b0a-85b2-636b5ce5ef51" 
containerName="collect-profiles" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.471310 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e4f09f-d19e-4b0a-85b2-636b5ce5ef51" containerName="collect-profiles" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.471420 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="87e4f09f-d19e-4b0a-85b2-636b5ce5ef51" containerName="collect-profiles" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.471433 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ce5c74b-2f06-4910-92b5-54abaa46ab8b" containerName="pruner" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.472350 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.474705 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.475893 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.484077 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.514321 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.514446 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.615451 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.615543 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.615562 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.634564 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.794854 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 20:32:00 crc kubenswrapper[4790]: I0313 20:32:00.132737 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557232-bblq8"] Mar 13 20:32:00 crc kubenswrapper[4790]: I0313 20:32:00.134100 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557232-bblq8" Mar 13 20:32:00 crc kubenswrapper[4790]: I0313 20:32:00.136214 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:32:00 crc kubenswrapper[4790]: I0313 20:32:00.140139 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557232-bblq8"] Mar 13 20:32:00 crc kubenswrapper[4790]: I0313 20:32:00.234739 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpxhz\" (UniqueName: \"kubernetes.io/projected/b190462f-7836-44f0-94c0-1311bdf8e550-kube-api-access-jpxhz\") pod \"auto-csr-approver-29557232-bblq8\" (UID: \"b190462f-7836-44f0-94c0-1311bdf8e550\") " pod="openshift-infra/auto-csr-approver-29557232-bblq8" Mar 13 20:32:00 crc kubenswrapper[4790]: I0313 20:32:00.335975 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpxhz\" (UniqueName: \"kubernetes.io/projected/b190462f-7836-44f0-94c0-1311bdf8e550-kube-api-access-jpxhz\") pod \"auto-csr-approver-29557232-bblq8\" (UID: \"b190462f-7836-44f0-94c0-1311bdf8e550\") " pod="openshift-infra/auto-csr-approver-29557232-bblq8" Mar 13 20:32:00 crc kubenswrapper[4790]: I0313 20:32:00.354481 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpxhz\" (UniqueName: \"kubernetes.io/projected/b190462f-7836-44f0-94c0-1311bdf8e550-kube-api-access-jpxhz\") pod \"auto-csr-approver-29557232-bblq8\" (UID: 
\"b190462f-7836-44f0-94c0-1311bdf8e550\") " pod="openshift-infra/auto-csr-approver-29557232-bblq8" Mar 13 20:32:00 crc kubenswrapper[4790]: I0313 20:32:00.449613 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557232-bblq8" Mar 13 20:32:01 crc kubenswrapper[4790]: I0313 20:32:01.918467 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.467440 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.468077 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.487751 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.565925 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-var-lock\") pod \"installer-9-crc\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.566251 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c05d613-28a6-4eb7-b289-e7d1cad59990-kube-api-access\") pod \"installer-9-crc\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.566479 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.667914 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-var-lock\") pod \"installer-9-crc\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.667981 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c05d613-28a6-4eb7-b289-e7d1cad59990-kube-api-access\") pod \"installer-9-crc\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.668006 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.668028 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-var-lock\") pod \"installer-9-crc\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.668067 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " 
pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.688224 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c05d613-28a6-4eb7-b289-e7d1cad59990-kube-api-access\") pod \"installer-9-crc\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.795888 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:32:06 crc kubenswrapper[4790]: I0313 20:32:06.192212 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:32:06 crc kubenswrapper[4790]: I0313 20:32:06.192542 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:32:08 crc kubenswrapper[4790]: I0313 20:32:08.686789 4790 patch_prober.go:28] interesting pod/route-controller-manager-69fc968766-v5gfg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:32:08 crc kubenswrapper[4790]: I0313 20:32:08.687102 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" podUID="ff385bac-0b93-4dc8-b8bc-ef1b4986649b" 
containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:32:08 crc kubenswrapper[4790]: I0313 20:32:08.742743 4790 patch_prober.go:28] interesting pod/controller-manager-8687f458cd-h5svs container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:32:08 crc kubenswrapper[4790]: I0313 20:32:08.742843 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" podUID="04038bbe-4cc0-4d19-80d7-f86cdffda1d5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:32:12 crc kubenswrapper[4790]: E0313 20:32:12.153173 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage238052257/2\": happened during read: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 13 20:32:12 crc kubenswrapper[4790]: E0313 20:32:12.153692 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xbqp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-bq4pj_openshift-marketplace(e17d5bd1-f368-47a4-80cb-3bd3eb4b822c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage238052257/2\": happened during read: context canceled" logger="UnhandledError" Mar 13 20:32:12 crc kubenswrapper[4790]: E0313 20:32:12.154891 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file 
\\\"/var/tmp/container_images_storage238052257/2\\\": happened during read: context canceled\"" pod="openshift-marketplace/redhat-marketplace-bq4pj" podUID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.016885 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.016943 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.016986 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.017961 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.018019 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400" gracePeriod=600 Mar 13 20:32:14 crc 
kubenswrapper[4790]: E0313 20:32:14.884338 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-bq4pj" podUID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.924548 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.929032 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.954245 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-serving-cert\") pod \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.954323 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-config\") pod \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.954351 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-797h5\" (UniqueName: \"kubernetes.io/projected/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-kube-api-access-797h5\") pod \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.954401 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-client-ca\") pod \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.954454 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-serving-cert\") pod \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.954484 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-config\") pod \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.954516 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbn26\" (UniqueName: \"kubernetes.io/projected/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-kube-api-access-cbn26\") pod \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.954559 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-proxy-ca-bundles\") pod \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.954609 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-client-ca\") pod \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " Mar 13 20:32:14 crc 
kubenswrapper[4790]: I0313 20:32:14.955563 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-client-ca" (OuterVolumeSpecName: "client-ca") pod "04038bbe-4cc0-4d19-80d7-f86cdffda1d5" (UID: "04038bbe-4cc0-4d19-80d7-f86cdffda1d5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.958831 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-config" (OuterVolumeSpecName: "config") pod "04038bbe-4cc0-4d19-80d7-f86cdffda1d5" (UID: "04038bbe-4cc0-4d19-80d7-f86cdffda1d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.959854 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-client-ca" (OuterVolumeSpecName: "client-ca") pod "ff385bac-0b93-4dc8-b8bc-ef1b4986649b" (UID: "ff385bac-0b93-4dc8-b8bc-ef1b4986649b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.960288 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "04038bbe-4cc0-4d19-80d7-f86cdffda1d5" (UID: "04038bbe-4cc0-4d19-80d7-f86cdffda1d5"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.962565 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-kube-api-access-cbn26" (OuterVolumeSpecName: "kube-api-access-cbn26") pod "04038bbe-4cc0-4d19-80d7-f86cdffda1d5" (UID: "04038bbe-4cc0-4d19-80d7-f86cdffda1d5"). InnerVolumeSpecName "kube-api-access-cbn26". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.962649 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-config" (OuterVolumeSpecName: "config") pod "ff385bac-0b93-4dc8-b8bc-ef1b4986649b" (UID: "ff385bac-0b93-4dc8-b8bc-ef1b4986649b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.962884 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "04038bbe-4cc0-4d19-80d7-f86cdffda1d5" (UID: "04038bbe-4cc0-4d19-80d7-f86cdffda1d5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.963986 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx"] Mar 13 20:32:14 crc kubenswrapper[4790]: E0313 20:32:14.964261 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff385bac-0b93-4dc8-b8bc-ef1b4986649b" containerName="route-controller-manager" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.964273 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff385bac-0b93-4dc8-b8bc-ef1b4986649b" containerName="route-controller-manager" Mar 13 20:32:14 crc kubenswrapper[4790]: E0313 20:32:14.964283 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04038bbe-4cc0-4d19-80d7-f86cdffda1d5" containerName="controller-manager" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.964290 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="04038bbe-4cc0-4d19-80d7-f86cdffda1d5" containerName="controller-manager" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.964483 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ff385bac-0b93-4dc8-b8bc-ef1b4986649b" (UID: "ff385bac-0b93-4dc8-b8bc-ef1b4986649b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.964936 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff385bac-0b93-4dc8-b8bc-ef1b4986649b" containerName="route-controller-manager" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.965032 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="04038bbe-4cc0-4d19-80d7-f86cdffda1d5" containerName="controller-manager" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.966724 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.967957 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-kube-api-access-797h5" (OuterVolumeSpecName: "kube-api-access-797h5") pod "ff385bac-0b93-4dc8-b8bc-ef1b4986649b" (UID: "ff385bac-0b93-4dc8-b8bc-ef1b4986649b"). InnerVolumeSpecName "kube-api-access-797h5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.976652 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx"] Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.055328 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvnkc\" (UniqueName: \"kubernetes.io/projected/25fd28fa-57e3-41b6-8329-693cbfb20e89-kube-api-access-kvnkc\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.055519 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-config\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.055631 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-client-ca\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.055701 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-proxy-ca-bundles\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " 
pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.055852 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fd28fa-57e3-41b6-8329-693cbfb20e89-serving-cert\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.055958 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.055973 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.055986 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.055995 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-797h5\" (UniqueName: \"kubernetes.io/projected/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-kube-api-access-797h5\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.056004 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.056012 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.056021 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.056029 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbn26\" (UniqueName: \"kubernetes.io/projected/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-kube-api-access-cbn26\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.056036 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.057632 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" event={"ID":"04038bbe-4cc0-4d19-80d7-f86cdffda1d5","Type":"ContainerDied","Data":"db98d9a2f6fdbbe8abd9aaa1bfbfc3ef07a5ef170be435e5a3e26c5a2b07958a"} Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.057665 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.057690 4790 scope.go:117] "RemoveContainer" containerID="c582f7273eb48a0199c8d7ed2bdfea28605189fac5c66a90356664d5f29d8618" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.060899 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" event={"ID":"ff385bac-0b93-4dc8-b8bc-ef1b4986649b","Type":"ContainerDied","Data":"e64961300406621b1586951d2f5b3f6e49675a9edac316b75dde99552bd4b189"} Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.061002 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.064493 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400" exitCode=0 Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.064525 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400"} Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.090759 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8687f458cd-h5svs"] Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.093299 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8687f458cd-h5svs"] Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.101116 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg"] Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.104281 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg"] Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.156706 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-config\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.156781 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-client-ca\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.156824 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-proxy-ca-bundles\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.156854 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fd28fa-57e3-41b6-8329-693cbfb20e89-serving-cert\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 
20:32:15.156885 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvnkc\" (UniqueName: \"kubernetes.io/projected/25fd28fa-57e3-41b6-8329-693cbfb20e89-kube-api-access-kvnkc\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.157886 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-client-ca\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.158097 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-config\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.158186 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-proxy-ca-bundles\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.172476 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fd28fa-57e3-41b6-8329-693cbfb20e89-serving-cert\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " 
pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.180649 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvnkc\" (UniqueName: \"kubernetes.io/projected/25fd28fa-57e3-41b6-8329-693cbfb20e89-kube-api-access-kvnkc\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.312703 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.666778 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04038bbe-4cc0-4d19-80d7-f86cdffda1d5" path="/var/lib/kubelet/pods/04038bbe-4cc0-4d19-80d7-f86cdffda1d5/volumes" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.667476 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff385bac-0b93-4dc8-b8bc-ef1b4986649b" path="/var/lib/kubelet/pods/ff385bac-0b93-4dc8-b8bc-ef1b4986649b/volumes" Mar 13 20:32:16 crc kubenswrapper[4790]: I0313 20:32:16.192164 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:32:16 crc kubenswrapper[4790]: I0313 20:32:16.192261 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.684631 4790 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq"] Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.685769 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.687965 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.687999 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.688143 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.688824 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.690262 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.691371 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.694071 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq"] Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.791254 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-client-ca\") pod \"route-controller-manager-67995dc89c-q5mcq\" 
(UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.791305 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v86xf\" (UniqueName: \"kubernetes.io/projected/506dcf0c-8c65-486f-ac8d-e16ba9474095-kube-api-access-v86xf\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.791348 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-config\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.791405 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/506dcf0c-8c65-486f-ac8d-e16ba9474095-serving-cert\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.892353 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-client-ca\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.892422 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v86xf\" (UniqueName: \"kubernetes.io/projected/506dcf0c-8c65-486f-ac8d-e16ba9474095-kube-api-access-v86xf\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.892471 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-config\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.892510 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/506dcf0c-8c65-486f-ac8d-e16ba9474095-serving-cert\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.893485 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-client-ca\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.894022 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-config\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " 
pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.901396 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/506dcf0c-8c65-486f-ac8d-e16ba9474095-serving-cert\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.908867 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v86xf\" (UniqueName: \"kubernetes.io/projected/506dcf0c-8c65-486f-ac8d-e16ba9474095-kube-api-access-v86xf\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:18 crc kubenswrapper[4790]: I0313 20:32:18.043015 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:20 crc kubenswrapper[4790]: E0313 20:32:20.105159 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:1cbc3fa0429b1e7ccd7344896a786f490a69cd57258c89894900d0f00ccac64e: Get \"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:1cbc3fa0429b1e7ccd7344896a786f490a69cd57258c89894900d0f00ccac64e\": context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 13 20:32:20 crc kubenswrapper[4790]: E0313 20:32:20.105957 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x5fzv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,P
rocMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-mf4tm_openshift-marketplace(f1be7d98-ff3a-42bb-b8ff-4001814ae453): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:1cbc3fa0429b1e7ccd7344896a786f490a69cd57258c89894900d0f00ccac64e: Get \"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:1cbc3fa0429b1e7ccd7344896a786f490a69cd57258c89894900d0f00ccac64e\": context canceled" logger="UnhandledError" Mar 13 20:32:20 crc kubenswrapper[4790]: E0313 20:32:20.107245 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:1cbc3fa0429b1e7ccd7344896a786f490a69cd57258c89894900d0f00ccac64e: Get \\\"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:1cbc3fa0429b1e7ccd7344896a786f490a69cd57258c89894900d0f00ccac64e\\\": context canceled\"" pod="openshift-marketplace/redhat-marketplace-mf4tm" podUID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" Mar 13 20:32:20 crc kubenswrapper[4790]: E0313 20:32:20.387983 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 13 20:32:20 crc kubenswrapper[4790]: E0313 20:32:20.388152 4790 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 20:32:20 crc kubenswrapper[4790]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not 
.status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 13 20:32:20 crc kubenswrapper[4790]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hb9zk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29557230-8pqh8_openshift-infra(d598b7c0-7c77-4903-9138-d8a3d01f9efe): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 13 20:32:20 crc kubenswrapper[4790]: > logger="UnhandledError" Mar 13 20:32:20 crc kubenswrapper[4790]: E0313 20:32:20.389398 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29557230-8pqh8" podUID="d598b7c0-7c77-4903-9138-d8a3d01f9efe" Mar 13 20:32:21 crc kubenswrapper[4790]: E0313 20:32:21.100596 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29557230-8pqh8" podUID="d598b7c0-7c77-4903-9138-d8a3d01f9efe" Mar 13 20:32:21 crc kubenswrapper[4790]: E0313 20:32:21.649305 4790 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-mf4tm" podUID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" Mar 13 20:32:23 crc kubenswrapper[4790]: E0313 20:32:23.259720 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 13 20:32:23 crc kubenswrapper[4790]: E0313 20:32:23.260438 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hskct,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,Proc
Mount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-txx64_openshift-marketplace(7080e6b3-5934-4c2c-9361-23d20b5a495e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 20:32:23 crc kubenswrapper[4790]: E0313 20:32:23.261656 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-txx64" podUID="7080e6b3-5934-4c2c-9361-23d20b5a495e" Mar 13 20:32:26 crc kubenswrapper[4790]: I0313 20:32:26.191282 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:32:26 crc kubenswrapper[4790]: I0313 20:32:26.191628 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:32:26 crc kubenswrapper[4790]: E0313 20:32:26.385540 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-txx64" 
podUID="7080e6b3-5934-4c2c-9361-23d20b5a495e" Mar 13 20:32:26 crc kubenswrapper[4790]: E0313 20:32:26.455674 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 13 20:32:26 crc kubenswrapper[4790]: E0313 20:32:26.455828 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhj8f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:ni
l,} start failed in pod redhat-operators-fxjp7_openshift-marketplace(4aa0c26b-aef8-49e9-9904-da9e8d029c9d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 20:32:26 crc kubenswrapper[4790]: E0313 20:32:26.457566 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-fxjp7" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" Mar 13 20:32:26 crc kubenswrapper[4790]: E0313 20:32:26.480312 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 13 20:32:26 crc kubenswrapper[4790]: E0313 20:32:26.480463 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fwk57,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-df8gv_openshift-marketplace(da03af74-8c59-4ccf-aff8-03dc6303e322): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 20:32:26 crc kubenswrapper[4790]: E0313 20:32:26.481761 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-df8gv" podUID="da03af74-8c59-4ccf-aff8-03dc6303e322" Mar 13 20:32:28 crc 
kubenswrapper[4790]: E0313 20:32:28.162420 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-df8gv" podUID="da03af74-8c59-4ccf-aff8-03dc6303e322" Mar 13 20:32:28 crc kubenswrapper[4790]: E0313 20:32:28.171072 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-fxjp7" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" Mar 13 20:32:28 crc kubenswrapper[4790]: I0313 20:32:28.211759 4790 scope.go:117] "RemoveContainer" containerID="fe3a1b339d4ab84389f1307367b2bebb3168e6d421f121726f4d936f123537ed" Mar 13 20:32:28 crc kubenswrapper[4790]: E0313 20:32:28.249740 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 13 20:32:28 crc kubenswrapper[4790]: E0313 20:32:28.250676 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 13 20:32:28 crc kubenswrapper[4790]: E0313 20:32:28.250908 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zhkbp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-672cv_openshift-marketplace(dbee8a79-e625-49ef-8fcb-944341ae6e37): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 20:32:28 crc kubenswrapper[4790]: E0313 20:32:28.252135 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-672cv" podUID="dbee8a79-e625-49ef-8fcb-944341ae6e37" Mar 13 20:32:28 crc 
kubenswrapper[4790]: E0313 20:32:28.253333 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4dmtw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-5tr4n_openshift-marketplace(446f0f4c-a97c-47d0-929d-0b99e07c8186): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 20:32:28 crc kubenswrapper[4790]: E0313 20:32:28.255006 4790 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5tr4n" podUID="446f0f4c-a97c-47d0-929d-0b99e07c8186" Mar 13 20:32:28 crc kubenswrapper[4790]: I0313 20:32:28.456223 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557232-bblq8"] Mar 13 20:32:28 crc kubenswrapper[4790]: I0313 20:32:28.508161 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 20:32:28 crc kubenswrapper[4790]: I0313 20:32:28.747810 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 20:32:28 crc kubenswrapper[4790]: I0313 20:32:28.861969 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx"] Mar 13 20:32:28 crc kubenswrapper[4790]: I0313 20:32:28.868177 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq"] Mar 13 20:32:28 crc kubenswrapper[4790]: W0313 20:32:28.883870 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25fd28fa_57e3_41b6_8329_693cbfb20e89.slice/crio-d16751fe8d93ef90d78c72ad7048c33f0e61e8d759f5ea723ab7c391af56413d WatchSource:0}: Error finding container d16751fe8d93ef90d78c72ad7048c33f0e61e8d759f5ea723ab7c391af56413d: Status 404 returned error can't find the container with id d16751fe8d93ef90d78c72ad7048c33f0e61e8d759f5ea723ab7c391af56413d Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.147444 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" 
event={"ID":"25fd28fa-57e3-41b6-8329-693cbfb20e89","Type":"ContainerStarted","Data":"541380d65713151c215e4663c0f030bcf539002fb9c48968a355c573726423c9"} Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.147816 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" event={"ID":"25fd28fa-57e3-41b6-8329-693cbfb20e89","Type":"ContainerStarted","Data":"d16751fe8d93ef90d78c72ad7048c33f0e61e8d759f5ea723ab7c391af56413d"} Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.149655 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.155187 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zfhhl" event={"ID":"6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150","Type":"ContainerStarted","Data":"cdca48635a5083c3a3adb08d1d13d2bc3dcf5e76b82c79cc4754522c8cfa7f45"} Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.156139 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-zfhhl" Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.159344 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.159462 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.160531 4790 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.162149 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" event={"ID":"506dcf0c-8c65-486f-ac8d-e16ba9474095","Type":"ContainerStarted","Data":"9be66b8654608963a58b8257f6d910c9aaebfd7bf0c22829f9fb2eab27a9a153"} Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.162182 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" event={"ID":"506dcf0c-8c65-486f-ac8d-e16ba9474095","Type":"ContainerStarted","Data":"6227ca10032f20f7061333e184a7f5fd825e11a98a57054176baad69903a3e6d"} Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.163246 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.181048 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"88573fd1abdc5f0d1779ca1679bd1333545fafe5b76c1a0f0888a58d27d16db6"} Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.184656 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hnd2l" event={"ID":"36d32cb2-55c9-48cc-9376-66231ae66f8a","Type":"ContainerStarted","Data":"73d3471f670ba4404f090445863d367e893e2298e86dde9160ee12a7e04a36a6"} Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.186423 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c","Type":"ContainerStarted","Data":"9f6af6894778163383ed6bc7ed4bee995281a37fdd644ab6915400ceabaa99c9"} Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.186488 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c","Type":"ContainerStarted","Data":"7c45eb7619c8e10226f3c5dac1f003594c20c32d5deea5cafda395f8e88d886e"} Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.192404 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" podStartSLOduration=26.192362314 podStartE2EDuration="26.192362314s" podCreationTimestamp="2026-03-13 20:32:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:32:29.173019839 +0000 UTC m=+280.194135730" watchObservedRunningTime="2026-03-13 20:32:29.192362314 +0000 UTC m=+280.213478205" Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.200185 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557232-bblq8" event={"ID":"b190462f-7836-44f0-94c0-1311bdf8e550","Type":"ContainerStarted","Data":"a23eb85d97b1e4751bafccab0781c9447925014836e41ef0f17d54c7448721b2"} Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.215051 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c05d613-28a6-4eb7-b289-e7d1cad59990","Type":"ContainerStarted","Data":"efd1d06a6ce25e4e3fca34226ca853275cb494e1f7d417b592640cdbae34182e"} Mar 13 20:32:29 crc kubenswrapper[4790]: E0313 20:32:29.217858 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-672cv" podUID="dbee8a79-e625-49ef-8fcb-944341ae6e37" Mar 13 20:32:29 crc kubenswrapper[4790]: E0313 20:32:29.218113 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5tr4n" podUID="446f0f4c-a97c-47d0-929d-0b99e07c8186" Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.260592 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" podStartSLOduration=25.260571604 podStartE2EDuration="25.260571604s" podCreationTimestamp="2026-03-13 20:32:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:32:29.258297682 +0000 UTC m=+280.279413573" watchObservedRunningTime="2026-03-13 20:32:29.260571604 +0000 UTC m=+280.281687495" Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.330614 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=27.330591834 podStartE2EDuration="27.330591834s" podCreationTimestamp="2026-03-13 20:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:32:29.327641714 +0000 UTC m=+280.348757615" watchObservedRunningTime="2026-03-13 20:32:29.330591834 +0000 UTC m=+280.351707725" Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.385070 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=31.385046222 podStartE2EDuration="31.385046222s" 
podCreationTimestamp="2026-03-13 20:31:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:32:29.382820711 +0000 UTC m=+280.403936602" watchObservedRunningTime="2026-03-13 20:32:29.385046222 +0000 UTC m=+280.406162113" Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.549061 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:30 crc kubenswrapper[4790]: I0313 20:32:30.225018 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557232-bblq8" event={"ID":"b190462f-7836-44f0-94c0-1311bdf8e550","Type":"ContainerStarted","Data":"7924ab194fb126f41405d7a390a1fb75af9316272755308a5775fdb0f460db4d"} Mar 13 20:32:30 crc kubenswrapper[4790]: I0313 20:32:30.227347 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c05d613-28a6-4eb7-b289-e7d1cad59990","Type":"ContainerStarted","Data":"58f159651637d3217394d3f34d5549bae6158dd0fd270cdccfb0e48c45bc1c2d"} Mar 13 20:32:30 crc kubenswrapper[4790]: I0313 20:32:30.229493 4790 generic.go:334] "Generic (PLEG): container finished" podID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerID="73d3471f670ba4404f090445863d367e893e2298e86dde9160ee12a7e04a36a6" exitCode=0 Mar 13 20:32:30 crc kubenswrapper[4790]: I0313 20:32:30.229572 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hnd2l" event={"ID":"36d32cb2-55c9-48cc-9376-66231ae66f8a","Type":"ContainerDied","Data":"73d3471f670ba4404f090445863d367e893e2298e86dde9160ee12a7e04a36a6"} Mar 13 20:32:30 crc kubenswrapper[4790]: I0313 20:32:30.230730 4790 generic.go:334] "Generic (PLEG): container finished" podID="6c64aba6-6db6-4d23-91f9-9ba5f7b2373c" 
containerID="9f6af6894778163383ed6bc7ed4bee995281a37fdd644ab6915400ceabaa99c9" exitCode=0 Mar 13 20:32:30 crc kubenswrapper[4790]: I0313 20:32:30.231244 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c","Type":"ContainerDied","Data":"9f6af6894778163383ed6bc7ed4bee995281a37fdd644ab6915400ceabaa99c9"} Mar 13 20:32:30 crc kubenswrapper[4790]: I0313 20:32:30.232908 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:32:30 crc kubenswrapper[4790]: I0313 20:32:30.232946 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:32:30 crc kubenswrapper[4790]: I0313 20:32:30.581527 4790 csr.go:261] certificate signing request csr-zhfvx is approved, waiting to be issued Mar 13 20:32:30 crc kubenswrapper[4790]: I0313 20:32:30.589508 4790 csr.go:257] certificate signing request csr-zhfvx is issued Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.241707 4790 generic.go:334] "Generic (PLEG): container finished" podID="b190462f-7836-44f0-94c0-1311bdf8e550" containerID="7924ab194fb126f41405d7a390a1fb75af9316272755308a5775fdb0f460db4d" exitCode=0 Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.241775 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557232-bblq8" event={"ID":"b190462f-7836-44f0-94c0-1311bdf8e550","Type":"ContainerDied","Data":"7924ab194fb126f41405d7a390a1fb75af9316272755308a5775fdb0f460db4d"} Mar 13 20:32:31 crc 
kubenswrapper[4790]: I0313 20:32:31.243587 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.243641 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.568273 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.587974 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kubelet-dir\") pod \"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c\" (UID: \"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c\") " Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.588073 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kube-api-access\") pod \"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c\" (UID: \"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c\") " Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.589109 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6c64aba6-6db6-4d23-91f9-9ba5f7b2373c" (UID: "6c64aba6-6db6-4d23-91f9-9ba5f7b2373c"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.594509 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-08 21:43:25.708006275 +0000 UTC Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.594775 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6481h10m54.113234091s for next certificate rotation Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.595771 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6c64aba6-6db6-4d23-91f9-9ba5f7b2373c" (UID: "6c64aba6-6db6-4d23-91f9-9ba5f7b2373c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.689618 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.689868 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:32 crc kubenswrapper[4790]: I0313 20:32:32.248449 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c","Type":"ContainerDied","Data":"7c45eb7619c8e10226f3c5dac1f003594c20c32d5deea5cafda395f8e88d886e"} Mar 13 20:32:32 crc kubenswrapper[4790]: I0313 20:32:32.248501 4790 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="7c45eb7619c8e10226f3c5dac1f003594c20c32d5deea5cafda395f8e88d886e" Mar 13 20:32:32 crc kubenswrapper[4790]: I0313 20:32:32.248464 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 20:32:32 crc kubenswrapper[4790]: I0313 20:32:32.595705 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-08 22:34:01.937259568 +0000 UTC Mar 13 20:32:32 crc kubenswrapper[4790]: I0313 20:32:32.596040 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6482h1m29.341225409s for next certificate rotation Mar 13 20:32:33 crc kubenswrapper[4790]: I0313 20:32:33.917811 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557232-bblq8" Mar 13 20:32:34 crc kubenswrapper[4790]: I0313 20:32:34.034225 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpxhz\" (UniqueName: \"kubernetes.io/projected/b190462f-7836-44f0-94c0-1311bdf8e550-kube-api-access-jpxhz\") pod \"b190462f-7836-44f0-94c0-1311bdf8e550\" (UID: \"b190462f-7836-44f0-94c0-1311bdf8e550\") " Mar 13 20:32:34 crc kubenswrapper[4790]: I0313 20:32:34.041454 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b190462f-7836-44f0-94c0-1311bdf8e550-kube-api-access-jpxhz" (OuterVolumeSpecName: "kube-api-access-jpxhz") pod "b190462f-7836-44f0-94c0-1311bdf8e550" (UID: "b190462f-7836-44f0-94c0-1311bdf8e550"). InnerVolumeSpecName "kube-api-access-jpxhz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:32:34 crc kubenswrapper[4790]: I0313 20:32:34.135548 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpxhz\" (UniqueName: \"kubernetes.io/projected/b190462f-7836-44f0-94c0-1311bdf8e550-kube-api-access-jpxhz\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:34 crc kubenswrapper[4790]: I0313 20:32:34.264090 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557232-bblq8" Mar 13 20:32:34 crc kubenswrapper[4790]: I0313 20:32:34.264712 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557232-bblq8" event={"ID":"b190462f-7836-44f0-94c0-1311bdf8e550","Type":"ContainerDied","Data":"a23eb85d97b1e4751bafccab0781c9447925014836e41ef0f17d54c7448721b2"} Mar 13 20:32:34 crc kubenswrapper[4790]: I0313 20:32:34.264767 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a23eb85d97b1e4751bafccab0781c9447925014836e41ef0f17d54c7448721b2" Mar 13 20:32:34 crc kubenswrapper[4790]: I0313 20:32:34.266958 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hnd2l" event={"ID":"36d32cb2-55c9-48cc-9376-66231ae66f8a","Type":"ContainerStarted","Data":"85bd59b87e1f4b58047275cf65a277b9c79fac88d40c0b516ac9852cc7b0c0af"} Mar 13 20:32:35 crc kubenswrapper[4790]: I0313 20:32:35.296485 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hnd2l" podStartSLOduration=4.6210440550000005 podStartE2EDuration="1m8.296467045s" podCreationTimestamp="2026-03-13 20:31:27 +0000 UTC" firstStartedPulling="2026-03-13 20:31:30.194206684 +0000 UTC m=+221.215322575" lastFinishedPulling="2026-03-13 20:32:33.869629674 +0000 UTC m=+284.890745565" observedRunningTime="2026-03-13 20:32:35.294767989 +0000 UTC m=+286.315883880" watchObservedRunningTime="2026-03-13 
20:32:35.296467045 +0000 UTC m=+286.317582936" Mar 13 20:32:36 crc kubenswrapper[4790]: I0313 20:32:36.190964 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:32:36 crc kubenswrapper[4790]: I0313 20:32:36.191449 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:32:36 crc kubenswrapper[4790]: I0313 20:32:36.190995 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:32:36 crc kubenswrapper[4790]: I0313 20:32:36.191566 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:32:36 crc kubenswrapper[4790]: I0313 20:32:36.284748 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bq4pj" event={"ID":"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c","Type":"ContainerStarted","Data":"fb06926f483f81716d03c8b9371fdea2581fe7126069171b7e5648810c33b206"} Mar 13 20:32:37 crc kubenswrapper[4790]: I0313 20:32:37.299011 4790 generic.go:334] "Generic (PLEG): container finished" podID="d598b7c0-7c77-4903-9138-d8a3d01f9efe" 
containerID="8f1a4232fe3ee20e22f3a57d7811b303dba4631c6cf2890a09449767842fc5b4" exitCode=0 Mar 13 20:32:37 crc kubenswrapper[4790]: I0313 20:32:37.299332 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557230-8pqh8" event={"ID":"d598b7c0-7c77-4903-9138-d8a3d01f9efe","Type":"ContainerDied","Data":"8f1a4232fe3ee20e22f3a57d7811b303dba4631c6cf2890a09449767842fc5b4"} Mar 13 20:32:37 crc kubenswrapper[4790]: I0313 20:32:37.301021 4790 generic.go:334] "Generic (PLEG): container finished" podID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" containerID="fb06926f483f81716d03c8b9371fdea2581fe7126069171b7e5648810c33b206" exitCode=0 Mar 13 20:32:37 crc kubenswrapper[4790]: I0313 20:32:37.301067 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bq4pj" event={"ID":"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c","Type":"ContainerDied","Data":"fb06926f483f81716d03c8b9371fdea2581fe7126069171b7e5648810c33b206"} Mar 13 20:32:37 crc kubenswrapper[4790]: I0313 20:32:37.304055 4790 generic.go:334] "Generic (PLEG): container finished" podID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" containerID="912e4880b7a24b954e780b6b21a914866c6b2e2fd8684cf3dc798b5f59ce287f" exitCode=0 Mar 13 20:32:37 crc kubenswrapper[4790]: I0313 20:32:37.304085 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf4tm" event={"ID":"f1be7d98-ff3a-42bb-b8ff-4001814ae453","Type":"ContainerDied","Data":"912e4880b7a24b954e780b6b21a914866c6b2e2fd8684cf3dc798b5f59ce287f"} Mar 13 20:32:38 crc kubenswrapper[4790]: I0313 20:32:38.075686 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:32:38 crc kubenswrapper[4790]: I0313 20:32:38.075810 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:32:38 crc kubenswrapper[4790]: I0313 20:32:38.779990 
4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557230-8pqh8" Mar 13 20:32:38 crc kubenswrapper[4790]: I0313 20:32:38.928276 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb9zk\" (UniqueName: \"kubernetes.io/projected/d598b7c0-7c77-4903-9138-d8a3d01f9efe-kube-api-access-hb9zk\") pod \"d598b7c0-7c77-4903-9138-d8a3d01f9efe\" (UID: \"d598b7c0-7c77-4903-9138-d8a3d01f9efe\") " Mar 13 20:32:38 crc kubenswrapper[4790]: I0313 20:32:38.934455 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d598b7c0-7c77-4903-9138-d8a3d01f9efe-kube-api-access-hb9zk" (OuterVolumeSpecName: "kube-api-access-hb9zk") pod "d598b7c0-7c77-4903-9138-d8a3d01f9efe" (UID: "d598b7c0-7c77-4903-9138-d8a3d01f9efe"). InnerVolumeSpecName "kube-api-access-hb9zk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:32:39 crc kubenswrapper[4790]: I0313 20:32:39.037396 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb9zk\" (UniqueName: \"kubernetes.io/projected/d598b7c0-7c77-4903-9138-d8a3d01f9efe-kube-api-access-hb9zk\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:39 crc kubenswrapper[4790]: I0313 20:32:39.320559 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557230-8pqh8" event={"ID":"d598b7c0-7c77-4903-9138-d8a3d01f9efe","Type":"ContainerDied","Data":"3851738f410766329c5133a13a2bdd38c600122354cde8d6b4c645c3b69815b7"} Mar 13 20:32:39 crc kubenswrapper[4790]: I0313 20:32:39.320878 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3851738f410766329c5133a13a2bdd38c600122354cde8d6b4c645c3b69815b7" Mar 13 20:32:39 crc kubenswrapper[4790]: I0313 20:32:39.320626 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557230-8pqh8" Mar 13 20:32:40 crc kubenswrapper[4790]: I0313 20:32:40.134861 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hnd2l" podUID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerName="registry-server" probeResult="failure" output=< Mar 13 20:32:40 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Mar 13 20:32:40 crc kubenswrapper[4790]: > Mar 13 20:32:43 crc kubenswrapper[4790]: I0313 20:32:43.966034 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx"] Mar 13 20:32:43 crc kubenswrapper[4790]: I0313 20:32:43.966804 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" podUID="25fd28fa-57e3-41b6-8329-693cbfb20e89" containerName="controller-manager" containerID="cri-o://541380d65713151c215e4663c0f030bcf539002fb9c48968a355c573726423c9" gracePeriod=30 Mar 13 20:32:44 crc kubenswrapper[4790]: I0313 20:32:44.181195 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq"] Mar 13 20:32:44 crc kubenswrapper[4790]: I0313 20:32:44.181476 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" podUID="506dcf0c-8c65-486f-ac8d-e16ba9474095" containerName="route-controller-manager" containerID="cri-o://9be66b8654608963a58b8257f6d910c9aaebfd7bf0c22829f9fb2eab27a9a153" gracePeriod=30 Mar 13 20:32:44 crc kubenswrapper[4790]: I0313 20:32:44.346986 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bq4pj" 
event={"ID":"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c","Type":"ContainerStarted","Data":"674b4b30c55e5b326d6218ed4dd61e880c35ab5aace228b74177c0e6379905ee"} Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.314983 4790 patch_prober.go:28] interesting pod/controller-manager-578f7cc4b8-ngnwx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" start-of-body= Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.315956 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" podUID="25fd28fa-57e3-41b6-8329-693cbfb20e89" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.352661 4790 generic.go:334] "Generic (PLEG): container finished" podID="25fd28fa-57e3-41b6-8329-693cbfb20e89" containerID="541380d65713151c215e4663c0f030bcf539002fb9c48968a355c573726423c9" exitCode=0 Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.352713 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" event={"ID":"25fd28fa-57e3-41b6-8329-693cbfb20e89","Type":"ContainerDied","Data":"541380d65713151c215e4663c0f030bcf539002fb9c48968a355c573726423c9"} Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.354369 4790 generic.go:334] "Generic (PLEG): container finished" podID="506dcf0c-8c65-486f-ac8d-e16ba9474095" containerID="9be66b8654608963a58b8257f6d910c9aaebfd7bf0c22829f9fb2eab27a9a153" exitCode=0 Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.354638 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" 
event={"ID":"506dcf0c-8c65-486f-ac8d-e16ba9474095","Type":"ContainerDied","Data":"9be66b8654608963a58b8257f6d910c9aaebfd7bf0c22829f9fb2eab27a9a153"} Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.373288 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bq4pj" podStartSLOduration=5.39208067 podStartE2EDuration="1m19.373269219s" podCreationTimestamp="2026-03-13 20:31:26 +0000 UTC" firstStartedPulling="2026-03-13 20:31:29.147130772 +0000 UTC m=+220.168246663" lastFinishedPulling="2026-03-13 20:32:43.128319321 +0000 UTC m=+294.149435212" observedRunningTime="2026-03-13 20:32:45.370286198 +0000 UTC m=+296.391402089" watchObservedRunningTime="2026-03-13 20:32:45.373269219 +0000 UTC m=+296.394385110" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.925339 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.947334 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-86f474f899-ksxbf"] Mar 13 20:32:45 crc kubenswrapper[4790]: E0313 20:32:45.947577 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c64aba6-6db6-4d23-91f9-9ba5f7b2373c" containerName="pruner" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.947588 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c64aba6-6db6-4d23-91f9-9ba5f7b2373c" containerName="pruner" Mar 13 20:32:45 crc kubenswrapper[4790]: E0313 20:32:45.947604 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b190462f-7836-44f0-94c0-1311bdf8e550" containerName="oc" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.947610 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b190462f-7836-44f0-94c0-1311bdf8e550" containerName="oc" Mar 13 20:32:45 crc kubenswrapper[4790]: E0313 20:32:45.947622 
4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d598b7c0-7c77-4903-9138-d8a3d01f9efe" containerName="oc" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.947628 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d598b7c0-7c77-4903-9138-d8a3d01f9efe" containerName="oc" Mar 13 20:32:45 crc kubenswrapper[4790]: E0313 20:32:45.947637 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25fd28fa-57e3-41b6-8329-693cbfb20e89" containerName="controller-manager" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.947643 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="25fd28fa-57e3-41b6-8329-693cbfb20e89" containerName="controller-manager" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.947727 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d598b7c0-7c77-4903-9138-d8a3d01f9efe" containerName="oc" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.947735 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b190462f-7836-44f0-94c0-1311bdf8e550" containerName="oc" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.947745 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="25fd28fa-57e3-41b6-8329-693cbfb20e89" containerName="controller-manager" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.947752 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c64aba6-6db6-4d23-91f9-9ba5f7b2373c" containerName="pruner" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.948087 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.968153 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86f474f899-ksxbf"] Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.992456 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.079447 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v86xf\" (UniqueName: \"kubernetes.io/projected/506dcf0c-8c65-486f-ac8d-e16ba9474095-kube-api-access-v86xf\") pod \"506dcf0c-8c65-486f-ac8d-e16ba9474095\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.080644 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/506dcf0c-8c65-486f-ac8d-e16ba9474095-serving-cert\") pod \"506dcf0c-8c65-486f-ac8d-e16ba9474095\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.080682 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-config\") pod \"506dcf0c-8c65-486f-ac8d-e16ba9474095\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.080723 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fd28fa-57e3-41b6-8329-693cbfb20e89-serving-cert\") pod \"25fd28fa-57e3-41b6-8329-693cbfb20e89\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.080749 4790 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvnkc\" (UniqueName: \"kubernetes.io/projected/25fd28fa-57e3-41b6-8329-693cbfb20e89-kube-api-access-kvnkc\") pod \"25fd28fa-57e3-41b6-8329-693cbfb20e89\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.080772 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-client-ca\") pod \"506dcf0c-8c65-486f-ac8d-e16ba9474095\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.080824 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-proxy-ca-bundles\") pod \"25fd28fa-57e3-41b6-8329-693cbfb20e89\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.080858 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-config\") pod \"25fd28fa-57e3-41b6-8329-693cbfb20e89\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.080925 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-client-ca\") pod \"25fd28fa-57e3-41b6-8329-693cbfb20e89\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.081159 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh62r\" (UniqueName: \"kubernetes.io/projected/677b8903-b2f7-437f-a96d-f72d1ed30de5-kube-api-access-gh62r\") pod 
\"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.081204 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-client-ca\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.081228 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-proxy-ca-bundles\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.081267 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/677b8903-b2f7-437f-a96d-f72d1ed30de5-serving-cert\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.081307 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-config\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.081436 4790 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-config" (OuterVolumeSpecName: "config") pod "506dcf0c-8c65-486f-ac8d-e16ba9474095" (UID: "506dcf0c-8c65-486f-ac8d-e16ba9474095"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.081484 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-client-ca" (OuterVolumeSpecName: "client-ca") pod "506dcf0c-8c65-486f-ac8d-e16ba9474095" (UID: "506dcf0c-8c65-486f-ac8d-e16ba9474095"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.082035 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-client-ca" (OuterVolumeSpecName: "client-ca") pod "25fd28fa-57e3-41b6-8329-693cbfb20e89" (UID: "25fd28fa-57e3-41b6-8329-693cbfb20e89"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.082057 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "25fd28fa-57e3-41b6-8329-693cbfb20e89" (UID: "25fd28fa-57e3-41b6-8329-693cbfb20e89"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.082334 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-config" (OuterVolumeSpecName: "config") pod "25fd28fa-57e3-41b6-8329-693cbfb20e89" (UID: "25fd28fa-57e3-41b6-8329-693cbfb20e89"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.085238 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/506dcf0c-8c65-486f-ac8d-e16ba9474095-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "506dcf0c-8c65-486f-ac8d-e16ba9474095" (UID: "506dcf0c-8c65-486f-ac8d-e16ba9474095"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.085312 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/506dcf0c-8c65-486f-ac8d-e16ba9474095-kube-api-access-v86xf" (OuterVolumeSpecName: "kube-api-access-v86xf") pod "506dcf0c-8c65-486f-ac8d-e16ba9474095" (UID: "506dcf0c-8c65-486f-ac8d-e16ba9474095"). InnerVolumeSpecName "kube-api-access-v86xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.085329 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25fd28fa-57e3-41b6-8329-693cbfb20e89-kube-api-access-kvnkc" (OuterVolumeSpecName: "kube-api-access-kvnkc") pod "25fd28fa-57e3-41b6-8329-693cbfb20e89" (UID: "25fd28fa-57e3-41b6-8329-693cbfb20e89"). InnerVolumeSpecName "kube-api-access-kvnkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.085616 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25fd28fa-57e3-41b6-8329-693cbfb20e89-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "25fd28fa-57e3-41b6-8329-693cbfb20e89" (UID: "25fd28fa-57e3-41b6-8329-693cbfb20e89"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182044 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-client-ca\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182093 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-proxy-ca-bundles\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182143 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/677b8903-b2f7-437f-a96d-f72d1ed30de5-serving-cert\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182193 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-config\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182243 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh62r\" (UniqueName: \"kubernetes.io/projected/677b8903-b2f7-437f-a96d-f72d1ed30de5-kube-api-access-gh62r\") pod 
\"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182304 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182319 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fd28fa-57e3-41b6-8329-693cbfb20e89-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182332 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvnkc\" (UniqueName: \"kubernetes.io/projected/25fd28fa-57e3-41b6-8329-693cbfb20e89-kube-api-access-kvnkc\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182344 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182356 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182367 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182405 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-client-ca\") on node \"crc\" DevicePath \"\"" Mar 
13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182423 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v86xf\" (UniqueName: \"kubernetes.io/projected/506dcf0c-8c65-486f-ac8d-e16ba9474095-kube-api-access-v86xf\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182436 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/506dcf0c-8c65-486f-ac8d-e16ba9474095-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.184063 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-client-ca\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.184237 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-proxy-ca-bundles\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.184496 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-config\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.190080 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/677b8903-b2f7-437f-a96d-f72d1ed30de5-serving-cert\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.198287 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh62r\" (UniqueName: \"kubernetes.io/projected/677b8903-b2f7-437f-a96d-f72d1ed30de5-kube-api-access-gh62r\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.209777 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-zfhhl" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.291399 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.363086 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.363082 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" event={"ID":"25fd28fa-57e3-41b6-8329-693cbfb20e89","Type":"ContainerDied","Data":"d16751fe8d93ef90d78c72ad7048c33f0e61e8d759f5ea723ab7c391af56413d"} Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.363227 4790 scope.go:117] "RemoveContainer" containerID="541380d65713151c215e4663c0f030bcf539002fb9c48968a355c573726423c9" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.365038 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.364826 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" event={"ID":"506dcf0c-8c65-486f-ac8d-e16ba9474095","Type":"ContainerDied","Data":"6227ca10032f20f7061333e184a7f5fd825e11a98a57054176baad69903a3e6d"} Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.403985 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx"] Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.408804 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx"] Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.409805 4790 scope.go:117] "RemoveContainer" containerID="9be66b8654608963a58b8257f6d910c9aaebfd7bf0c22829f9fb2eab27a9a153" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.425599 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq"] Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.429842 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq"] Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.700570 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86f474f899-ksxbf"] Mar 13 20:32:46 crc kubenswrapper[4790]: W0313 20:32:46.709792 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod677b8903_b2f7_437f_a96d_f72d1ed30de5.slice/crio-e049c1b35eaf105e3a48a1ffc015b42b546f94fe2407bfc286c1b405c7dd9b1b WatchSource:0}: Error finding container 
e049c1b35eaf105e3a48a1ffc015b42b546f94fe2407bfc286c1b405c7dd9b1b: Status 404 returned error can't find the container with id e049c1b35eaf105e3a48a1ffc015b42b546f94fe2407bfc286c1b405c7dd9b1b Mar 13 20:32:47 crc kubenswrapper[4790]: I0313 20:32:47.081285 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:32:47 crc kubenswrapper[4790]: I0313 20:32:47.081338 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:32:47 crc kubenswrapper[4790]: I0313 20:32:47.372348 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:32:47 crc kubenswrapper[4790]: I0313 20:32:47.376422 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf4tm" event={"ID":"f1be7d98-ff3a-42bb-b8ff-4001814ae453","Type":"ContainerStarted","Data":"65470ced9c79cf36f4934c87e2e1721bdd054f66e3e7ccb08a55f44636a86692"} Mar 13 20:32:47 crc kubenswrapper[4790]: I0313 20:32:47.379972 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" event={"ID":"677b8903-b2f7-437f-a96d-f72d1ed30de5","Type":"ContainerStarted","Data":"e049c1b35eaf105e3a48a1ffc015b42b546f94fe2407bfc286c1b405c7dd9b1b"} Mar 13 20:32:47 crc kubenswrapper[4790]: I0313 20:32:47.667915 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25fd28fa-57e3-41b6-8329-693cbfb20e89" path="/var/lib/kubelet/pods/25fd28fa-57e3-41b6-8329-693cbfb20e89/volumes" Mar 13 20:32:47 crc kubenswrapper[4790]: I0313 20:32:47.668567 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="506dcf0c-8c65-486f-ac8d-e16ba9474095" path="/var/lib/kubelet/pods/506dcf0c-8c65-486f-ac8d-e16ba9474095/volumes" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.155491 4790 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.197946 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.386149 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" event={"ID":"677b8903-b2f7-437f-a96d-f72d1ed30de5","Type":"ContainerStarted","Data":"02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6"} Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.386816 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.393954 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.409754 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mf4tm" podStartSLOduration=4.663511714 podStartE2EDuration="1m21.409737412s" podCreationTimestamp="2026-03-13 20:31:27 +0000 UTC" firstStartedPulling="2026-03-13 20:31:29.174766952 +0000 UTC m=+220.195882843" lastFinishedPulling="2026-03-13 20:32:45.92099265 +0000 UTC m=+296.942108541" observedRunningTime="2026-03-13 20:32:48.403742879 +0000 UTC m=+299.424858780" watchObservedRunningTime="2026-03-13 20:32:48.409737412 +0000 UTC m=+299.430853293" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.420837 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" podStartSLOduration=5.420821003 podStartE2EDuration="5.420821003s" podCreationTimestamp="2026-03-13 
20:32:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:32:48.419154947 +0000 UTC m=+299.440270838" watchObservedRunningTime="2026-03-13 20:32:48.420821003 +0000 UTC m=+299.441936884" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.703682 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp"] Mar 13 20:32:48 crc kubenswrapper[4790]: E0313 20:32:48.703965 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="506dcf0c-8c65-486f-ac8d-e16ba9474095" containerName="route-controller-manager" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.703982 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="506dcf0c-8c65-486f-ac8d-e16ba9474095" containerName="route-controller-manager" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.704101 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="506dcf0c-8c65-486f-ac8d-e16ba9474095" containerName="route-controller-manager" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.704603 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.706631 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.706645 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.708683 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.708887 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.718355 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.720598 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.721848 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp"] Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.815356 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z5jr\" (UniqueName: \"kubernetes.io/projected/caa9c85a-2ba8-49ea-804e-f3b63b511642-kube-api-access-9z5jr\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.815852 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-client-ca\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.815980 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/caa9c85a-2ba8-49ea-804e-f3b63b511642-serving-cert\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.816167 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-config\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.917368 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z5jr\" (UniqueName: \"kubernetes.io/projected/caa9c85a-2ba8-49ea-804e-f3b63b511642-kube-api-access-9z5jr\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.917440 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-client-ca\") pod 
\"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.917493 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/caa9c85a-2ba8-49ea-804e-f3b63b511642-serving-cert\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.917514 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-config\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.919240 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-client-ca\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.919445 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-config\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.933923 4790 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/caa9c85a-2ba8-49ea-804e-f3b63b511642-serving-cert\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.941123 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z5jr\" (UniqueName: \"kubernetes.io/projected/caa9c85a-2ba8-49ea-804e-f3b63b511642-kube-api-access-9z5jr\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:49 crc kubenswrapper[4790]: I0313 20:32:49.024340 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:54 crc kubenswrapper[4790]: I0313 20:32:54.072565 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp"] Mar 13 20:32:54 crc kubenswrapper[4790]: W0313 20:32:54.080795 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaa9c85a_2ba8_49ea_804e_f3b63b511642.slice/crio-0e94017dcb5c8bf9612bfd88c97145cf3cacc14ab749d3522375343566af18fd WatchSource:0}: Error finding container 0e94017dcb5c8bf9612bfd88c97145cf3cacc14ab749d3522375343566af18fd: Status 404 returned error can't find the container with id 0e94017dcb5c8bf9612bfd88c97145cf3cacc14ab749d3522375343566af18fd Mar 13 20:32:54 crc kubenswrapper[4790]: I0313 20:32:54.422667 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-672cv" 
event={"ID":"dbee8a79-e625-49ef-8fcb-944341ae6e37","Type":"ContainerStarted","Data":"3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91"} Mar 13 20:32:54 crc kubenswrapper[4790]: I0313 20:32:54.424434 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txx64" event={"ID":"7080e6b3-5934-4c2c-9361-23d20b5a495e","Type":"ContainerStarted","Data":"4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73"} Mar 13 20:32:54 crc kubenswrapper[4790]: I0313 20:32:54.426310 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tr4n" event={"ID":"446f0f4c-a97c-47d0-929d-0b99e07c8186","Type":"ContainerStarted","Data":"58a3c18d60db23fb517df83cf8f798fb4a929be2cac998373fad7a7e27e0143b"} Mar 13 20:32:54 crc kubenswrapper[4790]: I0313 20:32:54.428002 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df8gv" event={"ID":"da03af74-8c59-4ccf-aff8-03dc6303e322","Type":"ContainerStarted","Data":"69f53c59d1e74a1fc57678e4a1a5f136fbff7feef571b3a55782dea49bf4ca77"} Mar 13 20:32:54 crc kubenswrapper[4790]: I0313 20:32:54.430785 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxjp7" event={"ID":"4aa0c26b-aef8-49e9-9904-da9e8d029c9d","Type":"ContainerStarted","Data":"324ef417e590b70303b2a28886536562959e53b4d52847bd1309db91eab7a573"} Mar 13 20:32:54 crc kubenswrapper[4790]: I0313 20:32:54.432361 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" event={"ID":"caa9c85a-2ba8-49ea-804e-f3b63b511642","Type":"ContainerStarted","Data":"ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c"} Mar 13 20:32:54 crc kubenswrapper[4790]: I0313 20:32:54.432526 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" 
event={"ID":"caa9c85a-2ba8-49ea-804e-f3b63b511642","Type":"ContainerStarted","Data":"0e94017dcb5c8bf9612bfd88c97145cf3cacc14ab749d3522375343566af18fd"} Mar 13 20:32:54 crc kubenswrapper[4790]: I0313 20:32:54.433122 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:54 crc kubenswrapper[4790]: I0313 20:32:54.502059 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" podStartSLOduration=10.502038424 podStartE2EDuration="10.502038424s" podCreationTimestamp="2026-03-13 20:32:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:32:54.497360086 +0000 UTC m=+305.518475987" watchObservedRunningTime="2026-03-13 20:32:54.502038424 +0000 UTC m=+305.523154315" Mar 13 20:32:54 crc kubenswrapper[4790]: I0313 20:32:54.876940 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:55 crc kubenswrapper[4790]: I0313 20:32:55.441047 4790 generic.go:334] "Generic (PLEG): container finished" podID="7080e6b3-5934-4c2c-9361-23d20b5a495e" containerID="4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73" exitCode=0 Mar 13 20:32:55 crc kubenswrapper[4790]: I0313 20:32:55.441124 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txx64" event={"ID":"7080e6b3-5934-4c2c-9361-23d20b5a495e","Type":"ContainerDied","Data":"4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73"} Mar 13 20:32:55 crc kubenswrapper[4790]: I0313 20:32:55.445068 4790 generic.go:334] "Generic (PLEG): container finished" podID="446f0f4c-a97c-47d0-929d-0b99e07c8186" 
containerID="58a3c18d60db23fb517df83cf8f798fb4a929be2cac998373fad7a7e27e0143b" exitCode=0 Mar 13 20:32:55 crc kubenswrapper[4790]: I0313 20:32:55.445186 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tr4n" event={"ID":"446f0f4c-a97c-47d0-929d-0b99e07c8186","Type":"ContainerDied","Data":"58a3c18d60db23fb517df83cf8f798fb4a929be2cac998373fad7a7e27e0143b"} Mar 13 20:32:55 crc kubenswrapper[4790]: I0313 20:32:55.450647 4790 generic.go:334] "Generic (PLEG): container finished" podID="da03af74-8c59-4ccf-aff8-03dc6303e322" containerID="69f53c59d1e74a1fc57678e4a1a5f136fbff7feef571b3a55782dea49bf4ca77" exitCode=0 Mar 13 20:32:55 crc kubenswrapper[4790]: I0313 20:32:55.450728 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df8gv" event={"ID":"da03af74-8c59-4ccf-aff8-03dc6303e322","Type":"ContainerDied","Data":"69f53c59d1e74a1fc57678e4a1a5f136fbff7feef571b3a55782dea49bf4ca77"} Mar 13 20:32:55 crc kubenswrapper[4790]: I0313 20:32:55.454404 4790 generic.go:334] "Generic (PLEG): container finished" podID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerID="324ef417e590b70303b2a28886536562959e53b4d52847bd1309db91eab7a573" exitCode=0 Mar 13 20:32:55 crc kubenswrapper[4790]: I0313 20:32:55.454467 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxjp7" event={"ID":"4aa0c26b-aef8-49e9-9904-da9e8d029c9d","Type":"ContainerDied","Data":"324ef417e590b70303b2a28886536562959e53b4d52847bd1309db91eab7a573"} Mar 13 20:32:55 crc kubenswrapper[4790]: I0313 20:32:55.461919 4790 generic.go:334] "Generic (PLEG): container finished" podID="dbee8a79-e625-49ef-8fcb-944341ae6e37" containerID="3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91" exitCode=0 Mar 13 20:32:55 crc kubenswrapper[4790]: I0313 20:32:55.461995 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-672cv" 
event={"ID":"dbee8a79-e625-49ef-8fcb-944341ae6e37","Type":"ContainerDied","Data":"3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91"} Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.124983 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.474519 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxjp7" event={"ID":"4aa0c26b-aef8-49e9-9904-da9e8d029c9d","Type":"ContainerStarted","Data":"2991bdf1214b34771f3920c4e5c74e4a6f7ce03bf40eb290c472871cdaa464ce"} Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.476282 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-672cv" event={"ID":"dbee8a79-e625-49ef-8fcb-944341ae6e37","Type":"ContainerStarted","Data":"cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88"} Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.478507 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tr4n" event={"ID":"446f0f4c-a97c-47d0-929d-0b99e07c8186","Type":"ContainerStarted","Data":"283af7b78e5df22c61725b66908c69af3f6b7ed01b3dc5cf3a313cb16df58c38"} Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.480867 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df8gv" event={"ID":"da03af74-8c59-4ccf-aff8-03dc6303e322","Type":"ContainerStarted","Data":"934478e1636def539b4b75131eeeef3a5a527bcd02efeeb3dc4dc663186f9f4a"} Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.483267 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.483289 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.495213 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fxjp7" podStartSLOduration=2.543039907 podStartE2EDuration="1m29.49518795s" podCreationTimestamp="2026-03-13 20:31:28 +0000 UTC" firstStartedPulling="2026-03-13 20:31:30.188874649 +0000 UTC m=+221.209990540" lastFinishedPulling="2026-03-13 20:32:57.141022692 +0000 UTC m=+308.162138583" observedRunningTime="2026-03-13 20:32:57.489830585 +0000 UTC m=+308.510946486" watchObservedRunningTime="2026-03-13 20:32:57.49518795 +0000 UTC m=+308.516303861" Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.514537 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5tr4n" podStartSLOduration=3.561432804 podStartE2EDuration="1m33.514519445s" podCreationTimestamp="2026-03-13 20:31:24 +0000 UTC" firstStartedPulling="2026-03-13 20:31:27.053824332 +0000 UTC m=+218.074940223" lastFinishedPulling="2026-03-13 20:32:57.006910973 +0000 UTC m=+308.028026864" observedRunningTime="2026-03-13 20:32:57.511553875 +0000 UTC m=+308.532669766" watchObservedRunningTime="2026-03-13 20:32:57.514519445 +0000 UTC m=+308.535635346" Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.520995 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.535556 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-df8gv" podStartSLOduration=2.446325159 podStartE2EDuration="1m32.535540006s" podCreationTimestamp="2026-03-13 20:31:25 +0000 UTC" firstStartedPulling="2026-03-13 20:31:27.103042345 +0000 UTC m=+218.124158246" lastFinishedPulling="2026-03-13 20:32:57.192257202 +0000 UTC m=+308.213373093" 
observedRunningTime="2026-03-13 20:32:57.53016701 +0000 UTC m=+308.551282901" watchObservedRunningTime="2026-03-13 20:32:57.535540006 +0000 UTC m=+308.556655907" Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.547985 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-672cv" podStartSLOduration=2.650158367 podStartE2EDuration="1m33.547971013s" podCreationTimestamp="2026-03-13 20:31:24 +0000 UTC" firstStartedPulling="2026-03-13 20:31:26.000526991 +0000 UTC m=+217.021642882" lastFinishedPulling="2026-03-13 20:32:56.898339627 +0000 UTC m=+307.919455528" observedRunningTime="2026-03-13 20:32:57.546610005 +0000 UTC m=+308.567725896" watchObservedRunningTime="2026-03-13 20:32:57.547971013 +0000 UTC m=+308.569086904" Mar 13 20:32:58 crc kubenswrapper[4790]: I0313 20:32:58.489141 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txx64" event={"ID":"7080e6b3-5934-4c2c-9361-23d20b5a495e","Type":"ContainerStarted","Data":"e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723"} Mar 13 20:32:58 crc kubenswrapper[4790]: I0313 20:32:58.516270 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-txx64" podStartSLOduration=4.210761461 podStartE2EDuration="1m34.516252264s" podCreationTimestamp="2026-03-13 20:31:24 +0000 UTC" firstStartedPulling="2026-03-13 20:31:27.048412045 +0000 UTC m=+218.069527936" lastFinishedPulling="2026-03-13 20:32:57.353902848 +0000 UTC m=+308.375018739" observedRunningTime="2026-03-13 20:32:58.513409137 +0000 UTC m=+309.534525058" watchObservedRunningTime="2026-03-13 20:32:58.516252264 +0000 UTC m=+309.537368165" Mar 13 20:32:58 crc kubenswrapper[4790]: I0313 20:32:58.529611 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:32:58 crc kubenswrapper[4790]: I0313 
20:32:58.558229 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:32:58 crc kubenswrapper[4790]: I0313 20:32:58.558295 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:32:59 crc kubenswrapper[4790]: I0313 20:32:59.594400 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fxjp7" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerName="registry-server" probeResult="failure" output=< Mar 13 20:32:59 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Mar 13 20:32:59 crc kubenswrapper[4790]: > Mar 13 20:33:00 crc kubenswrapper[4790]: I0313 20:33:00.858330 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mf4tm"] Mar 13 20:33:00 crc kubenswrapper[4790]: I0313 20:33:00.858573 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mf4tm" podUID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" containerName="registry-server" containerID="cri-o://65470ced9c79cf36f4934c87e2e1721bdd054f66e3e7ccb08a55f44636a86692" gracePeriod=2 Mar 13 20:33:01 crc kubenswrapper[4790]: E0313 20:33:01.982660 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1be7d98_ff3a_42bb_b8ff_4001814ae453.slice/crio-conmon-65470ced9c79cf36f4934c87e2e1721bdd054f66e3e7ccb08a55f44636a86692.scope\": RecentStats: unable to find data in memory cache]" Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.516874 4790 generic.go:334] "Generic (PLEG): container finished" podID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" containerID="65470ced9c79cf36f4934c87e2e1721bdd054f66e3e7ccb08a55f44636a86692" exitCode=0 Mar 13 20:33:02 crc 
kubenswrapper[4790]: I0313 20:33:02.516915 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf4tm" event={"ID":"f1be7d98-ff3a-42bb-b8ff-4001814ae453","Type":"ContainerDied","Data":"65470ced9c79cf36f4934c87e2e1721bdd054f66e3e7ccb08a55f44636a86692"} Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.655931 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.756493 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-catalog-content\") pod \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.756667 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-utilities\") pod \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.756709 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5fzv\" (UniqueName: \"kubernetes.io/projected/f1be7d98-ff3a-42bb-b8ff-4001814ae453-kube-api-access-x5fzv\") pod \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.757611 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-utilities" (OuterVolumeSpecName: "utilities") pod "f1be7d98-ff3a-42bb-b8ff-4001814ae453" (UID: "f1be7d98-ff3a-42bb-b8ff-4001814ae453"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.762230 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1be7d98-ff3a-42bb-b8ff-4001814ae453-kube-api-access-x5fzv" (OuterVolumeSpecName: "kube-api-access-x5fzv") pod "f1be7d98-ff3a-42bb-b8ff-4001814ae453" (UID: "f1be7d98-ff3a-42bb-b8ff-4001814ae453"). InnerVolumeSpecName "kube-api-access-x5fzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.781542 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1be7d98-ff3a-42bb-b8ff-4001814ae453" (UID: "f1be7d98-ff3a-42bb-b8ff-4001814ae453"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.859008 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.859047 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.859060 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5fzv\" (UniqueName: \"kubernetes.io/projected/f1be7d98-ff3a-42bb-b8ff-4001814ae453-kube-api-access-x5fzv\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.524352 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf4tm" 
event={"ID":"f1be7d98-ff3a-42bb-b8ff-4001814ae453","Type":"ContainerDied","Data":"53cd4a75ebfee1686f2db1e566581c31a9b03470b4313025f3a980087eb27a00"} Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.524437 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.526480 4790 scope.go:117] "RemoveContainer" containerID="65470ced9c79cf36f4934c87e2e1721bdd054f66e3e7ccb08a55f44636a86692" Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.544555 4790 scope.go:117] "RemoveContainer" containerID="912e4880b7a24b954e780b6b21a914866c6b2e2fd8684cf3dc798b5f59ce287f" Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.551542 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mf4tm"] Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.554260 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mf4tm"] Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.583650 4790 scope.go:117] "RemoveContainer" containerID="afed47472efd96d5fb96f1be65a82143aad59afc7569141f603e4362a1d44b0e" Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.667834 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" path="/var/lib/kubelet/pods/f1be7d98-ff3a-42bb-b8ff-4001814ae453/volumes" Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.933889 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86f474f899-ksxbf"] Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.934137 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" podUID="677b8903-b2f7-437f-a96d-f72d1ed30de5" containerName="controller-manager" 
containerID="cri-o://02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6" gracePeriod=30 Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.966892 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp"] Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.967226 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" podUID="caa9c85a-2ba8-49ea-804e-f3b63b511642" containerName="route-controller-manager" containerID="cri-o://ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c" gracePeriod=30 Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.520986 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.527347 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.531926 4790 generic.go:334] "Generic (PLEG): container finished" podID="677b8903-b2f7-437f-a96d-f72d1ed30de5" containerID="02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6" exitCode=0 Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.531965 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.532011 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" event={"ID":"677b8903-b2f7-437f-a96d-f72d1ed30de5","Type":"ContainerDied","Data":"02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6"} Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.532043 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" event={"ID":"677b8903-b2f7-437f-a96d-f72d1ed30de5","Type":"ContainerDied","Data":"e049c1b35eaf105e3a48a1ffc015b42b546f94fe2407bfc286c1b405c7dd9b1b"} Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.532064 4790 scope.go:117] "RemoveContainer" containerID="02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.539932 4790 generic.go:334] "Generic (PLEG): container finished" podID="caa9c85a-2ba8-49ea-804e-f3b63b511642" containerID="ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c" exitCode=0 Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.539987 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" event={"ID":"caa9c85a-2ba8-49ea-804e-f3b63b511642","Type":"ContainerDied","Data":"ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c"} Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.540006 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.540019 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" event={"ID":"caa9c85a-2ba8-49ea-804e-f3b63b511642","Type":"ContainerDied","Data":"0e94017dcb5c8bf9612bfd88c97145cf3cacc14ab749d3522375343566af18fd"} Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.550206 4790 scope.go:117] "RemoveContainer" containerID="02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6" Mar 13 20:33:04 crc kubenswrapper[4790]: E0313 20:33:04.550701 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6\": container with ID starting with 02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6 not found: ID does not exist" containerID="02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.550746 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6"} err="failed to get container status \"02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6\": rpc error: code = NotFound desc = could not find container \"02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6\": container with ID starting with 02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6 not found: ID does not exist" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.550775 4790 scope.go:117] "RemoveContainer" containerID="ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.575187 4790 scope.go:117] "RemoveContainer" 
containerID="ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c" Mar 13 20:33:04 crc kubenswrapper[4790]: E0313 20:33:04.576253 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c\": container with ID starting with ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c not found: ID does not exist" containerID="ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.576323 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c"} err="failed to get container status \"ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c\": rpc error: code = NotFound desc = could not find container \"ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c\": container with ID starting with ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c not found: ID does not exist" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.687265 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-proxy-ca-bundles\") pod \"677b8903-b2f7-437f-a96d-f72d1ed30de5\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.687341 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh62r\" (UniqueName: \"kubernetes.io/projected/677b8903-b2f7-437f-a96d-f72d1ed30de5-kube-api-access-gh62r\") pod \"677b8903-b2f7-437f-a96d-f72d1ed30de5\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.687423 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-config\") pod \"677b8903-b2f7-437f-a96d-f72d1ed30de5\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.687451 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/677b8903-b2f7-437f-a96d-f72d1ed30de5-serving-cert\") pod \"677b8903-b2f7-437f-a96d-f72d1ed30de5\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.687520 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z5jr\" (UniqueName: \"kubernetes.io/projected/caa9c85a-2ba8-49ea-804e-f3b63b511642-kube-api-access-9z5jr\") pod \"caa9c85a-2ba8-49ea-804e-f3b63b511642\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.687545 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/caa9c85a-2ba8-49ea-804e-f3b63b511642-serving-cert\") pod \"caa9c85a-2ba8-49ea-804e-f3b63b511642\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.687574 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-client-ca\") pod \"677b8903-b2f7-437f-a96d-f72d1ed30de5\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.687654 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-client-ca\") pod \"caa9c85a-2ba8-49ea-804e-f3b63b511642\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 
20:33:04.687729 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-config\") pod \"caa9c85a-2ba8-49ea-804e-f3b63b511642\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.688188 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "677b8903-b2f7-437f-a96d-f72d1ed30de5" (UID: "677b8903-b2f7-437f-a96d-f72d1ed30de5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.688526 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-client-ca" (OuterVolumeSpecName: "client-ca") pod "677b8903-b2f7-437f-a96d-f72d1ed30de5" (UID: "677b8903-b2f7-437f-a96d-f72d1ed30de5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.688728 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-config" (OuterVolumeSpecName: "config") pod "677b8903-b2f7-437f-a96d-f72d1ed30de5" (UID: "677b8903-b2f7-437f-a96d-f72d1ed30de5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.688911 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-client-ca" (OuterVolumeSpecName: "client-ca") pod "caa9c85a-2ba8-49ea-804e-f3b63b511642" (UID: "caa9c85a-2ba8-49ea-804e-f3b63b511642"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.689183 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-config" (OuterVolumeSpecName: "config") pod "caa9c85a-2ba8-49ea-804e-f3b63b511642" (UID: "caa9c85a-2ba8-49ea-804e-f3b63b511642"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.692855 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caa9c85a-2ba8-49ea-804e-f3b63b511642-kube-api-access-9z5jr" (OuterVolumeSpecName: "kube-api-access-9z5jr") pod "caa9c85a-2ba8-49ea-804e-f3b63b511642" (UID: "caa9c85a-2ba8-49ea-804e-f3b63b511642"). InnerVolumeSpecName "kube-api-access-9z5jr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.692894 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/677b8903-b2f7-437f-a96d-f72d1ed30de5-kube-api-access-gh62r" (OuterVolumeSpecName: "kube-api-access-gh62r") pod "677b8903-b2f7-437f-a96d-f72d1ed30de5" (UID: "677b8903-b2f7-437f-a96d-f72d1ed30de5"). InnerVolumeSpecName "kube-api-access-gh62r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.692953 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/677b8903-b2f7-437f-a96d-f72d1ed30de5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "677b8903-b2f7-437f-a96d-f72d1ed30de5" (UID: "677b8903-b2f7-437f-a96d-f72d1ed30de5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.693600 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caa9c85a-2ba8-49ea-804e-f3b63b511642-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "caa9c85a-2ba8-49ea-804e-f3b63b511642" (UID: "caa9c85a-2ba8-49ea-804e-f3b63b511642"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.788907 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.788946 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh62r\" (UniqueName: \"kubernetes.io/projected/677b8903-b2f7-437f-a96d-f72d1ed30de5-kube-api-access-gh62r\") on node \"crc\" DevicePath \"\""
Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.788959 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-config\") on node \"crc\" DevicePath \"\""
Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.788969 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/677b8903-b2f7-437f-a96d-f72d1ed30de5-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.788978 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9z5jr\" (UniqueName: \"kubernetes.io/projected/caa9c85a-2ba8-49ea-804e-f3b63b511642-kube-api-access-9z5jr\") on node \"crc\" DevicePath \"\""
Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.788987 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/caa9c85a-2ba8-49ea-804e-f3b63b511642-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.788995 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-client-ca\") on node \"crc\" DevicePath \"\""
Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.789003 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-client-ca\") on node \"crc\" DevicePath \"\""
Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.789011 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-config\") on node \"crc\" DevicePath \"\""
Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.854822 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-672cv"
Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.854867 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-672cv"
Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.863981 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86f474f899-ksxbf"]
Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.869200 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-86f474f899-ksxbf"]
Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.874442 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp"]
Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.877005 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp"]
Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.894992 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-672cv"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.201318 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-txx64"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.201620 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-txx64"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.246195 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-txx64"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.260268 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5tr4n"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.261565 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5tr4n"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.309827 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5tr4n"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.499914 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-df8gv"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.500745 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-df8gv"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.547317 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-df8gv"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.592659 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-txx64"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.600583 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5tr4n"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.609944 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-df8gv"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.616084 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-672cv"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.667356 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="677b8903-b2f7-437f-a96d-f72d1ed30de5" path="/var/lib/kubelet/pods/677b8903-b2f7-437f-a96d-f72d1ed30de5/volumes"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.668053 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caa9c85a-2ba8-49ea-804e-f3b63b511642" path="/var/lib/kubelet/pods/caa9c85a-2ba8-49ea-804e-f3b63b511642/volumes"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.710742 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6"]
Mar 13 20:33:05 crc kubenswrapper[4790]: E0313 20:33:05.711063 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caa9c85a-2ba8-49ea-804e-f3b63b511642" containerName="route-controller-manager"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.711089 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="caa9c85a-2ba8-49ea-804e-f3b63b511642" containerName="route-controller-manager"
Mar 13 20:33:05 crc kubenswrapper[4790]: E0313 20:33:05.711120 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" containerName="extract-utilities"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.711133 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" containerName="extract-utilities"
Mar 13 20:33:05 crc kubenswrapper[4790]: E0313 20:33:05.711148 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" containerName="registry-server"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.711161 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" containerName="registry-server"
Mar 13 20:33:05 crc kubenswrapper[4790]: E0313 20:33:05.711182 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" containerName="extract-content"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.711193 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" containerName="extract-content"
Mar 13 20:33:05 crc kubenswrapper[4790]: E0313 20:33:05.711218 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677b8903-b2f7-437f-a96d-f72d1ed30de5" containerName="controller-manager"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.711230 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="677b8903-b2f7-437f-a96d-f72d1ed30de5" containerName="controller-manager"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.711368 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="caa9c85a-2ba8-49ea-804e-f3b63b511642" containerName="route-controller-manager"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.711411 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" containerName="registry-server"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.711424 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="677b8903-b2f7-437f-a96d-f72d1ed30de5" containerName="controller-manager"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.712010 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.713789 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6cdc6994c6-85s67"]
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.714496 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.715128 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.715607 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.715791 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.715940 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.716046 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.716159 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.717280 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.717558 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.718594 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6"]
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.718834 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.719030 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.722012 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.722142 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.724359 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.726210 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cdc6994c6-85s67"]
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.817296 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-szftl"]
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.902560 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/902a53b3-c223-40ae-9dd9-47830295158c-serving-cert\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.902840 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902a53b3-c223-40ae-9dd9-47830295158c-config\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.902934 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/588cbe72-1cb6-4464-bba0-142104029595-serving-cert\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.903040 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588cbe72-1cb6-4464-bba0-142104029595-config\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.903186 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/588cbe72-1cb6-4464-bba0-142104029595-client-ca\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.903308 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/902a53b3-c223-40ae-9dd9-47830295158c-client-ca\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.903464 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c6rg\" (UniqueName: \"kubernetes.io/projected/902a53b3-c223-40ae-9dd9-47830295158c-kube-api-access-6c6rg\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.903614 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkrxm\" (UniqueName: \"kubernetes.io/projected/588cbe72-1cb6-4464-bba0-142104029595-kube-api-access-qkrxm\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6"
Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.903725 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/902a53b3-c223-40ae-9dd9-47830295158c-proxy-ca-bundles\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.004401 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/588cbe72-1cb6-4464-bba0-142104029595-serving-cert\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.004450 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588cbe72-1cb6-4464-bba0-142104029595-config\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.004466 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/588cbe72-1cb6-4464-bba0-142104029595-client-ca\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.004490 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/902a53b3-c223-40ae-9dd9-47830295158c-client-ca\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.004508 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c6rg\" (UniqueName: \"kubernetes.io/projected/902a53b3-c223-40ae-9dd9-47830295158c-kube-api-access-6c6rg\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.004522 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkrxm\" (UniqueName: \"kubernetes.io/projected/588cbe72-1cb6-4464-bba0-142104029595-kube-api-access-qkrxm\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.004539 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/902a53b3-c223-40ae-9dd9-47830295158c-proxy-ca-bundles\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.004567 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/902a53b3-c223-40ae-9dd9-47830295158c-serving-cert\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.004615 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902a53b3-c223-40ae-9dd9-47830295158c-config\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.005849 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/902a53b3-c223-40ae-9dd9-47830295158c-client-ca\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.005921 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902a53b3-c223-40ae-9dd9-47830295158c-config\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.006284 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588cbe72-1cb6-4464-bba0-142104029595-config\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.006671 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/588cbe72-1cb6-4464-bba0-142104029595-client-ca\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.006981 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/902a53b3-c223-40ae-9dd9-47830295158c-proxy-ca-bundles\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.011542 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/902a53b3-c223-40ae-9dd9-47830295158c-serving-cert\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.013644 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/588cbe72-1cb6-4464-bba0-142104029595-serving-cert\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.023184 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkrxm\" (UniqueName: \"kubernetes.io/projected/588cbe72-1cb6-4464-bba0-142104029595-kube-api-access-qkrxm\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.023670 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c6rg\" (UniqueName: \"kubernetes.io/projected/902a53b3-c223-40ae-9dd9-47830295158c-kube-api-access-6c6rg\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.046682 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.057682 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.285280 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6"]
Mar 13 20:33:06 crc kubenswrapper[4790]: W0313 20:33:06.295058 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod588cbe72_1cb6_4464_bba0_142104029595.slice/crio-be6e5104d5f024e382f25f4e0b7ba0a57c041ccf5eec351b6ba16a2f016aee24 WatchSource:0}: Error finding container be6e5104d5f024e382f25f4e0b7ba0a57c041ccf5eec351b6ba16a2f016aee24: Status 404 returned error can't find the container with id be6e5104d5f024e382f25f4e0b7ba0a57c041ccf5eec351b6ba16a2f016aee24
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.556812 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" event={"ID":"588cbe72-1cb6-4464-bba0-142104029595","Type":"ContainerStarted","Data":"be6e5104d5f024e382f25f4e0b7ba0a57c041ccf5eec351b6ba16a2f016aee24"}
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.578032 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cdc6994c6-85s67"]
Mar 13 20:33:06 crc kubenswrapper[4790]: W0313 20:33:06.590715 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod902a53b3_c223_40ae_9dd9_47830295158c.slice/crio-fe4d5854cccf493ee1ae1ffd25f4473c94728e417f2e3c6e0cb816d3102db5b3 WatchSource:0}: Error finding container fe4d5854cccf493ee1ae1ffd25f4473c94728e417f2e3c6e0cb816d3102db5b3: Status 404 returned error can't find the container with id fe4d5854cccf493ee1ae1ffd25f4473c94728e417f2e3c6e0cb816d3102db5b3
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.804427 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.806474 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.815482 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.815523 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.815575 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.815598 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.815703 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.857436 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.886499 4790 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.886891 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b" gracePeriod=15
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.886984 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20" gracePeriod=15
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.887025 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5" gracePeriod=15
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.887150 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8" gracePeriod=15
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.887048 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765" gracePeriod=15
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.892090 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 13 20:33:06 crc kubenswrapper[4790]: E0313 20:33:06.892638 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.892680 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 20:33:06 crc kubenswrapper[4790]: E0313 20:33:06.892701 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.892717 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 13 20:33:06 crc kubenswrapper[4790]: E0313 20:33:06.892742 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.892759 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 20:33:06 crc kubenswrapper[4790]: E0313 20:33:06.892778 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.892793 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 13 20:33:06 crc kubenswrapper[4790]: E0313 20:33:06.892817 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.892837 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 13 20:33:06 crc kubenswrapper[4790]: E0313 20:33:06.892877 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.892893 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 20:33:06 crc kubenswrapper[4790]: E0313 20:33:06.892916 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.892931 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 13 20:33:06 crc kubenswrapper[4790]: E0313 20:33:06.892951 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.892967 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.893203 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.893231 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.893249 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.893371 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.893430 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.893644 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.893674 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.893699 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 13 20:33:06 crc kubenswrapper[4790]: E0313 20:33:06.894022 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.894047 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 20:33:06 crc kubenswrapper[4790]: E0313 20:33:06.894075 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.894092 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.894336 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.916163 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.916211 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.916320 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.916368 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.916589 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.916653 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.916697 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.916745 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName:
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.916799 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.916994 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:33:06 crc kubenswrapper[4790]: E0313 20:33:06.970425 4790 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.143:6443: connect: connection refused" event="&Event{ObjectMeta:{controller-manager-6cdc6994c6-85s67.189c80d0a9eab648 openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-6cdc6994c6-85s67,UID:902a53b3-c223-40ae-9dd9-47830295158c,APIVersion:v1,ResourceVersion:29927,FieldPath:spec.containers{controller-manager},},Reason:Created,Message:Created container controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:33:06.969417288 +0000 UTC m=+317.990533189,LastTimestamp:2026-03-13 20:33:06.969417288 +0000 UTC 
m=+317.990533189,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.017514 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.017563 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.017588 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.118480 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.118549 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.118587 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.118651 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.118662 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.118659 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.145403 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:33:07 crc kubenswrapper[4790]: W0313 20:33:07.164234 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-1150ef12fc3ec87620646e26a22e083434d29929c16c8f712df6762ab17cf133 WatchSource:0}: Error finding container 1150ef12fc3ec87620646e26a22e083434d29929c16c8f712df6762ab17cf133: Status 404 returned error can't find the container with id 1150ef12fc3ec87620646e26a22e083434d29929c16c8f712df6762ab17cf133 Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.563668 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"1150ef12fc3ec87620646e26a22e083434d29929c16c8f712df6762ab17cf133"} Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.565286 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" event={"ID":"588cbe72-1cb6-4464-bba0-142104029595","Type":"ContainerStarted","Data":"4c743e6d4f9c4d8ec78f2e9ce9d9828659f3f57c4e824415c3fc41b86d4afe30"} Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.565757 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.567398 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:07 crc 
kubenswrapper[4790]: I0313 20:33:07.567910 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.568297 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.568874 4790 generic.go:334] "Generic (PLEG): container finished" podID="1c05d613-28a6-4eb7-b289-e7d1cad59990" containerID="58f159651637d3217394d3f34d5549bae6158dd0fd270cdccfb0e48c45bc1c2d" exitCode=0 Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.568959 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c05d613-28a6-4eb7-b289-e7d1cad59990","Type":"ContainerDied","Data":"58f159651637d3217394d3f34d5549bae6158dd0fd270cdccfb0e48c45bc1c2d"} Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.569615 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.569994 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.570461 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.571148 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.572035 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.573620 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.574710 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20" exitCode=0 Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.574730 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8" 
exitCode=0 Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.574738 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5" exitCode=0 Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.574745 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765" exitCode=2 Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.574769 4790 scope.go:117] "RemoveContainer" containerID="39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.576502 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" event={"ID":"902a53b3-c223-40ae-9dd9-47830295158c","Type":"ContainerStarted","Data":"04e6ef7ed6cbd50b8c904b6fb505580ae5c2a5c3fcf9e7bcb50f0d6119d3ac05"} Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.576544 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" event={"ID":"902a53b3-c223-40ae-9dd9-47830295158c","Type":"ContainerStarted","Data":"fe4d5854cccf493ee1ae1ffd25f4473c94728e417f2e3c6e0cb816d3102db5b3"} Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.577935 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.578475 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.579329 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.580079 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.580522 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.565557 4790 patch_prober.go:28] interesting pod/route-controller-manager-d97755bf4-2ssx6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.565883 4790 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" podUID="588cbe72-1cb6-4464-bba0-142104029595" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.585518 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c"} Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.588429 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.589777 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.594415 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.595034 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.595474 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.595868 4790 status_manager.go:851] "Failed to get status for pod" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.596084 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.596319 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.597271 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.597647 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": 
dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.597890 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.598146 4790 status_manager.go:851] "Failed to get status for pod" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.598430 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.598646 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.627872 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.628522 4790 status_manager.go:851] "Failed to get status for pod" 
podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.628899 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.629097 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.629277 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.629469 4790 status_manager.go:851] "Failed to get status for pod" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: 
I0313 20:33:09.047089 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.048124 4790 status_manager.go:851] "Failed to get status for pod" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.048745 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.049000 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.049280 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.049629 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.156685 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-kubelet-dir\") pod \"1c05d613-28a6-4eb7-b289-e7d1cad59990\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.156797 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-var-lock\") pod \"1c05d613-28a6-4eb7-b289-e7d1cad59990\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.156955 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c05d613-28a6-4eb7-b289-e7d1cad59990-kube-api-access\") pod \"1c05d613-28a6-4eb7-b289-e7d1cad59990\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.157280 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1c05d613-28a6-4eb7-b289-e7d1cad59990" (UID: "1c05d613-28a6-4eb7-b289-e7d1cad59990"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.157351 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-var-lock" (OuterVolumeSpecName: "var-lock") pod "1c05d613-28a6-4eb7-b289-e7d1cad59990" (UID: "1c05d613-28a6-4eb7-b289-e7d1cad59990"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.164300 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c05d613-28a6-4eb7-b289-e7d1cad59990-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1c05d613-28a6-4eb7-b289-e7d1cad59990" (UID: "1c05d613-28a6-4eb7-b289-e7d1cad59990"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.258604 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.258643 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-var-lock\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.258655 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c05d613-28a6-4eb7-b289-e7d1cad59990-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.429233 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 
20:33:09.430390 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.431104 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.432145 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.432645 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.433111 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.433642 4790 status_manager.go:851] "Failed to get status for pod" 
podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.434006 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.561677 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.561739 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.561765 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.561813 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). 
InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.561838 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.561902 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.562083 4790 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.562099 4790 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.562111 4790 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.590101 4790 patch_prober.go:28] interesting pod/route-controller-manager-d97755bf4-2ssx6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": 
net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.590170 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" podUID="588cbe72-1cb6-4464-bba0-142104029595" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.596890 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.597094 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c05d613-28a6-4eb7-b289-e7d1cad59990","Type":"ContainerDied","Data":"efd1d06a6ce25e4e3fca34226ca853275cb494e1f7d417b592640cdbae34182e"} Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.597150 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efd1d06a6ce25e4e3fca34226ca853275cb494e1f7d417b592640cdbae34182e" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.600037 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.601008 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b" exitCode=0 Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.601083 4790 scope.go:117] "RemoveContainer" containerID="5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20" Mar 13 20:33:09 
crc kubenswrapper[4790]: I0313 20:33:09.601166 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.602474 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.603091 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.603612 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.604147 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.604475 4790 status_manager.go:851] "Failed to get status for pod" 
podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.604810 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.610457 4790 status_manager.go:851] "Failed to get status for pod" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.610788 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.616531 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.616805 4790 status_manager.go:851] "Failed to get status for pod" 
podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.617057 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.617214 4790 scope.go:117] "RemoveContainer" containerID="d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.617458 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.622554 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.623084 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.623588 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.623862 4790 status_manager.go:851] "Failed to get status for pod" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.624130 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.624489 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.629353 4790 scope.go:117] "RemoveContainer" containerID="c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5" Mar 13 20:33:09 crc 
kubenswrapper[4790]: I0313 20:33:09.641637 4790 scope.go:117] "RemoveContainer" containerID="70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.654513 4790 scope.go:117] "RemoveContainer" containerID="0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.663561 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.663842 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.664119 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.664618 4790 status_manager.go:851] "Failed to get status for pod" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 
20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.664987 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.665280 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.665818 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.671584 4790 scope.go:117] "RemoveContainer" containerID="d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.691278 4790 scope.go:117] "RemoveContainer" containerID="5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20" Mar 13 20:33:09 crc kubenswrapper[4790]: E0313 20:33:09.692208 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\": container with ID starting with 5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20 not found: ID does not exist" containerID="5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.692242 4790 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20"} err="failed to get container status \"5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\": rpc error: code = NotFound desc = could not find container \"5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\": container with ID starting with 5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20 not found: ID does not exist" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.692265 4790 scope.go:117] "RemoveContainer" containerID="d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8" Mar 13 20:33:09 crc kubenswrapper[4790]: E0313 20:33:09.692632 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\": container with ID starting with d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8 not found: ID does not exist" containerID="d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.692675 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8"} err="failed to get container status \"d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\": rpc error: code = NotFound desc = could not find container \"d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\": container with ID starting with d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8 not found: ID does not exist" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.692705 4790 scope.go:117] "RemoveContainer" containerID="c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5" Mar 13 20:33:09 crc kubenswrapper[4790]: E0313 20:33:09.693077 4790 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\": container with ID starting with c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5 not found: ID does not exist" containerID="c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.693143 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5"} err="failed to get container status \"c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\": rpc error: code = NotFound desc = could not find container \"c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\": container with ID starting with c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5 not found: ID does not exist" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.693202 4790 scope.go:117] "RemoveContainer" containerID="70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765" Mar 13 20:33:09 crc kubenswrapper[4790]: E0313 20:33:09.693616 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\": container with ID starting with 70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765 not found: ID does not exist" containerID="70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.693653 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765"} err="failed to get container status \"70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\": rpc error: code = NotFound desc = could not find container 
\"70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\": container with ID starting with 70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765 not found: ID does not exist" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.693675 4790 scope.go:117] "RemoveContainer" containerID="0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b" Mar 13 20:33:09 crc kubenswrapper[4790]: E0313 20:33:09.694158 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\": container with ID starting with 0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b not found: ID does not exist" containerID="0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.694224 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b"} err="failed to get container status \"0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\": rpc error: code = NotFound desc = could not find container \"0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\": container with ID starting with 0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b not found: ID does not exist" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.694254 4790 scope.go:117] "RemoveContainer" containerID="d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b" Mar 13 20:33:09 crc kubenswrapper[4790]: E0313 20:33:09.694591 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\": container with ID starting with d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b not found: ID does not exist" 
containerID="d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.694621 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b"} err="failed to get container status \"d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\": rpc error: code = NotFound desc = could not find container \"d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\": container with ID starting with d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b not found: ID does not exist" Mar 13 20:33:09 crc kubenswrapper[4790]: E0313 20:33:09.764357 4790 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.143:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" volumeName="registry-storage" Mar 13 20:33:13 crc kubenswrapper[4790]: E0313 20:33:13.189274 4790 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.143:6443: connect: connection refused" event="&Event{ObjectMeta:{controller-manager-6cdc6994c6-85s67.189c80d0a9eab648 openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-6cdc6994c6-85s67,UID:902a53b3-c223-40ae-9dd9-47830295158c,APIVersion:v1,ResourceVersion:29927,FieldPath:spec.containers{controller-manager},},Reason:Created,Message:Created container 
controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:33:06.969417288 +0000 UTC m=+317.990533189,LastTimestamp:2026-03-13 20:33:06.969417288 +0000 UTC m=+317.990533189,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:33:14 crc kubenswrapper[4790]: E0313 20:33:14.248484 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:14 crc kubenswrapper[4790]: E0313 20:33:14.249037 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:14 crc kubenswrapper[4790]: E0313 20:33:14.249495 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:14 crc kubenswrapper[4790]: E0313 20:33:14.249809 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:14 crc kubenswrapper[4790]: E0313 20:33:14.250078 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:14 crc kubenswrapper[4790]: I0313 20:33:14.250115 4790 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 
attempts to update lease" Mar 13 20:33:14 crc kubenswrapper[4790]: E0313 20:33:14.250336 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="200ms" Mar 13 20:33:14 crc kubenswrapper[4790]: E0313 20:33:14.450857 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="400ms" Mar 13 20:33:14 crc kubenswrapper[4790]: E0313 20:33:14.852297 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="800ms" Mar 13 20:33:15 crc kubenswrapper[4790]: E0313 20:33:15.653856 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="1.6s" Mar 13 20:33:17 crc kubenswrapper[4790]: I0313 20:33:17.047799 4790 patch_prober.go:28] interesting pod/route-controller-manager-d97755bf4-2ssx6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:33:17 crc kubenswrapper[4790]: I0313 20:33:17.047859 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" 
podUID="588cbe72-1cb6-4464-bba0-142104029595" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:33:17 crc kubenswrapper[4790]: E0313 20:33:17.255240 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="3.2s" Mar 13 20:33:18 crc kubenswrapper[4790]: I0313 20:33:18.659795 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:18 crc kubenswrapper[4790]: I0313 20:33:18.660504 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:18 crc kubenswrapper[4790]: I0313 20:33:18.661746 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:18 crc kubenswrapper[4790]: I0313 20:33:18.662012 4790 status_manager.go:851] "Failed to get status for pod" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 
38.102.83.143:6443: connect: connection refused" Mar 13 20:33:18 crc kubenswrapper[4790]: I0313 20:33:18.662262 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:18 crc kubenswrapper[4790]: I0313 20:33:18.662533 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:18 crc kubenswrapper[4790]: I0313 20:33:18.672878 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:18 crc kubenswrapper[4790]: I0313 20:33:18.673002 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:18 crc kubenswrapper[4790]: E0313 20:33:18.673443 4790 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:18 crc kubenswrapper[4790]: I0313 20:33:18.673925 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.664931 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.665647 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.665900 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.666250 4790 status_manager.go:851] "Failed to get status for pod" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.666724 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.667011 4790 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.693893 4790 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="e5fbb63e2245590ceae285d78e589e8f5934bc1a24c72f3f77d23c9facc5745a" exitCode=0 Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.693936 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"e5fbb63e2245590ceae285d78e589e8f5934bc1a24c72f3f77d23c9facc5745a"} Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.693960 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a93f5d4b50c5daa9e660832fd8936842c2289e878c0cf5cafd5b1c17e110a430"} Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.694195 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.694207 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.694623 4790 status_manager.go:851] "Failed to get status for pod" 
podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:19 crc kubenswrapper[4790]: E0313 20:33:19.694773 4790 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.695032 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.695261 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.695546 4790 status_manager.go:851] "Failed to get status for pod" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.695781 4790 
status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.696072 4790 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:20 crc kubenswrapper[4790]: I0313 20:33:20.701262 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c58e944651629812fb1f709b068ebfe7b62d91872a1307be5be697285ef730cc"} Mar 13 20:33:20 crc kubenswrapper[4790]: I0313 20:33:20.701596 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"01475734fe300f09a2d17bf8f8df03cdd784fad61de5cf01ddb519327c89b788"} Mar 13 20:33:20 crc kubenswrapper[4790]: I0313 20:33:20.701612 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b5555b03f24880db677285e78c50a4b4bf44c68867fd113ae11a8400217fd2d7"} Mar 13 20:33:20 crc kubenswrapper[4790]: I0313 20:33:20.701623 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6c3fced6e181d28f1686c7b499da58f2ea7f411ae0404aad88694a4e0c251831"} Mar 13 20:33:20 crc 
kubenswrapper[4790]: I0313 20:33:20.703556 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 13 20:33:20 crc kubenswrapper[4790]: I0313 20:33:20.703991 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 13 20:33:20 crc kubenswrapper[4790]: I0313 20:33:20.704029 4790 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe" exitCode=1 Mar 13 20:33:20 crc kubenswrapper[4790]: I0313 20:33:20.704053 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe"} Mar 13 20:33:20 crc kubenswrapper[4790]: I0313 20:33:20.704564 4790 scope.go:117] "RemoveContainer" containerID="341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe" Mar 13 20:33:21 crc kubenswrapper[4790]: I0313 20:33:21.711448 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 13 20:33:21 crc kubenswrapper[4790]: I0313 20:33:21.712186 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 13 20:33:21 crc kubenswrapper[4790]: I0313 20:33:21.712294 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c72290a070b857bc3ddf32c051800ee3fe9e55397ddcdfd5d29c98edd59be0a4"} Mar 13 20:33:21 crc kubenswrapper[4790]: I0313 20:33:21.715321 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5c8b521db5ddd733e8e2fe0d342090b0d72dc8a176c4374ba2a67b1e082ba497"} Mar 13 20:33:21 crc kubenswrapper[4790]: I0313 20:33:21.715523 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:21 crc kubenswrapper[4790]: I0313 20:33:21.715551 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:21 crc kubenswrapper[4790]: I0313 20:33:21.715568 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:22 crc kubenswrapper[4790]: I0313 20:33:22.699516 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:33:23 crc kubenswrapper[4790]: I0313 20:33:23.674950 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:23 crc kubenswrapper[4790]: I0313 20:33:23.675280 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:23 crc kubenswrapper[4790]: I0313 20:33:23.682861 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:26 crc kubenswrapper[4790]: I0313 20:33:26.725946 4790 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" 
Mar 13 20:33:27 crc kubenswrapper[4790]: I0313 20:33:27.048411 4790 patch_prober.go:28] interesting pod/route-controller-manager-d97755bf4-2ssx6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:33:27 crc kubenswrapper[4790]: I0313 20:33:27.048486 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" podUID="588cbe72-1cb6-4464-bba0-142104029595" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:33:27 crc kubenswrapper[4790]: I0313 20:33:27.751291 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:27 crc kubenswrapper[4790]: I0313 20:33:27.751328 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:27 crc kubenswrapper[4790]: I0313 20:33:27.755986 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:27 crc kubenswrapper[4790]: I0313 20:33:27.758496 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6d72aeee-4f97-4816-89d8-511a753d2f70" Mar 13 20:33:28 crc kubenswrapper[4790]: I0313 20:33:28.757613 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 
20:33:28 crc kubenswrapper[4790]: I0313 20:33:28.757646 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:29 crc kubenswrapper[4790]: I0313 20:33:29.699562 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6d72aeee-4f97-4816-89d8-511a753d2f70" Mar 13 20:33:30 crc kubenswrapper[4790]: I0313 20:33:30.260860 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:33:30 crc kubenswrapper[4790]: I0313 20:33:30.261050 4790 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 13 20:33:30 crc kubenswrapper[4790]: I0313 20:33:30.263250 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 13 20:33:30 crc kubenswrapper[4790]: I0313 20:33:30.850756 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" podUID="9680aeb7-b61a-46a8-baf5-44715261e4a5" containerName="oauth-openshift" containerID="cri-o://4dce60806026c2e057eacfafdb9eb0bcee1204f32aecb7bffa715ddddc59e383" gracePeriod=15 Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.779317 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="9680aeb7-b61a-46a8-baf5-44715261e4a5" containerID="4dce60806026c2e057eacfafdb9eb0bcee1204f32aecb7bffa715ddddc59e383" exitCode=0 Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.779411 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" event={"ID":"9680aeb7-b61a-46a8-baf5-44715261e4a5","Type":"ContainerDied","Data":"4dce60806026c2e057eacfafdb9eb0bcee1204f32aecb7bffa715ddddc59e383"} Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.779489 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" event={"ID":"9680aeb7-b61a-46a8-baf5-44715261e4a5","Type":"ContainerDied","Data":"7e7e7141df31dfc4ded27d369062544f96ae747ef387acfa5853705562325a54"} Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.779517 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e7e7141df31dfc4ded27d369062544f96ae747ef387acfa5853705562325a54" Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.800136 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982510 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-trusted-ca-bundle\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982575 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-dir\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982618 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-error\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982648 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-session\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982682 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-cliconfig\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc 
kubenswrapper[4790]: I0313 20:33:31.982723 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-serving-cert\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982757 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-service-ca\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982786 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-router-certs\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982829 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fdm8\" (UniqueName: \"kubernetes.io/projected/9680aeb7-b61a-46a8-baf5-44715261e4a5-kube-api-access-9fdm8\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982873 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-provider-selection\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982904 4790 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-login\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982933 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-ocp-branding-template\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982966 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-idp-0-file-data\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.983000 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-policies\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.984285 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.984883 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.984936 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.998938 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.011958 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.014229 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.014343 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.021310 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.022724 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9680aeb7-b61a-46a8-baf5-44715261e4a5-kube-api-access-9fdm8" (OuterVolumeSpecName: "kube-api-access-9fdm8") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "kube-api-access-9fdm8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.022841 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.023579 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.023908 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.025530 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.036893 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084068 4790 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084100 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084111 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084120 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084129 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 
crc kubenswrapper[4790]: I0313 20:33:32.084140 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084181 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084192 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fdm8\" (UniqueName: \"kubernetes.io/projected/9680aeb7-b61a-46a8-baf5-44715261e4a5-kube-api-access-9fdm8\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084203 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084212 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084221 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084229 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084237 4790 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084246 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.813215 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:33:37 crc kubenswrapper[4790]: I0313 20:33:37.047282 4790 patch_prober.go:28] interesting pod/route-controller-manager-d97755bf4-2ssx6 container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:33:37 crc kubenswrapper[4790]: I0313 20:33:37.047330 4790 patch_prober.go:28] interesting pod/route-controller-manager-d97755bf4-2ssx6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:33:37 crc kubenswrapper[4790]: I0313 20:33:37.048988 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" 
podUID="588cbe72-1cb6-4464-bba0-142104029595" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:33:37 crc kubenswrapper[4790]: I0313 20:33:37.049094 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" podUID="588cbe72-1cb6-4464-bba0-142104029595" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:33:37 crc kubenswrapper[4790]: I0313 20:33:37.864496 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-d97755bf4-2ssx6_588cbe72-1cb6-4464-bba0-142104029595/route-controller-manager/0.log" Mar 13 20:33:37 crc kubenswrapper[4790]: I0313 20:33:37.864574 4790 generic.go:334] "Generic (PLEG): container finished" podID="588cbe72-1cb6-4464-bba0-142104029595" containerID="4c743e6d4f9c4d8ec78f2e9ce9d9828659f3f57c4e824415c3fc41b86d4afe30" exitCode=255 Mar 13 20:33:37 crc kubenswrapper[4790]: I0313 20:33:37.864621 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" event={"ID":"588cbe72-1cb6-4464-bba0-142104029595","Type":"ContainerDied","Data":"4c743e6d4f9c4d8ec78f2e9ce9d9828659f3f57c4e824415c3fc41b86d4afe30"} Mar 13 20:33:37 crc kubenswrapper[4790]: I0313 20:33:37.865270 4790 scope.go:117] "RemoveContainer" containerID="4c743e6d4f9c4d8ec78f2e9ce9d9828659f3f57c4e824415c3fc41b86d4afe30" Mar 13 20:33:38 crc kubenswrapper[4790]: I0313 20:33:38.184434 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 13 20:33:38 crc 
kubenswrapper[4790]: I0313 20:33:38.382094 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 20:33:38 crc kubenswrapper[4790]: I0313 20:33:38.485023 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 13 20:33:38 crc kubenswrapper[4790]: I0313 20:33:38.707371 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 13 20:33:38 crc kubenswrapper[4790]: I0313 20:33:38.751145 4790 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 13 20:33:38 crc kubenswrapper[4790]: I0313 20:33:38.872102 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-d97755bf4-2ssx6_588cbe72-1cb6-4464-bba0-142104029595/route-controller-manager/0.log" Mar 13 20:33:38 crc kubenswrapper[4790]: I0313 20:33:38.872149 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" event={"ID":"588cbe72-1cb6-4464-bba0-142104029595","Type":"ContainerStarted","Data":"acb4633f1c59279d63c3c311f5e9691cd648254a5622f51f14f0b0357bc20516"} Mar 13 20:33:38 crc kubenswrapper[4790]: I0313 20:33:38.872680 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" Mar 13 20:33:38 crc kubenswrapper[4790]: I0313 20:33:38.878864 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.038547 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 
20:33:39.057220 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.107903 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.125690 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.216834 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.410694 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.515929 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.661897 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.733418 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.865410 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.872569 4790 patch_prober.go:28] interesting pod/route-controller-manager-d97755bf4-2ssx6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting 
for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.872646 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" podUID="588cbe72-1cb6-4464-bba0-142104029595" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.879641 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.900310 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.928746 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 13 20:33:40 crc kubenswrapper[4790]: I0313 20:33:40.009945 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 13 20:33:40 crc kubenswrapper[4790]: I0313 20:33:40.189509 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 20:33:40 crc kubenswrapper[4790]: I0313 20:33:40.230575 4790 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 13 20:33:40 crc kubenswrapper[4790]: I0313 20:33:40.230660 4790 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 13 20:33:40 crc kubenswrapper[4790]: I0313 20:33:40.324489 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 13 20:33:40 crc kubenswrapper[4790]: I0313 20:33:40.411069 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 13 20:33:40 crc kubenswrapper[4790]: I0313 20:33:40.713264 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 13 20:33:40 crc kubenswrapper[4790]: I0313 20:33:40.877814 4790 patch_prober.go:28] interesting pod/route-controller-manager-d97755bf4-2ssx6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:33:40 crc kubenswrapper[4790]: I0313 20:33:40.878152 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" podUID="588cbe72-1cb6-4464-bba0-142104029595" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.007166 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 13 20:33:41 crc 
kubenswrapper[4790]: I0313 20:33:41.038408 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.045639 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.171403 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.226645 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.246041 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.359432 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.375010 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.400823 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.484957 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.612720 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.621193 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.776517 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.819184 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.881652 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.903003 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.000101 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.000226 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.008259 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.058950 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.128339 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.139198 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"kube-root-ca.crt" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.193273 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.241513 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.243671 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.284945 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.305584 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.359026 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.468571 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.650280 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.815650 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.849114 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.849131 
4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.871736 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.078244 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.144973 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.168767 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.233046 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.304075 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.310389 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.334547 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.368559 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.419629 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 13 20:33:43 crc 
kubenswrapper[4790]: I0313 20:33:43.642042 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.731155 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.767054 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.880197 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.893857 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.922877 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.929142 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.936086 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.937928 4790 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.969519 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.994422 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 13 20:33:43 crc 
kubenswrapper[4790]: I0313 20:33:43.996990 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.232641 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.343467 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.377434 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.474853 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.635345 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.658700 4790 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.692848 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.724738 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.724977 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.809119 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"image-import-ca" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.837558 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.871252 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.886079 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.937456 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.955179 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.067691 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.159899 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.196510 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.218813 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.248966 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.353360 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.455905 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.496363 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.506350 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.523873 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.543270 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.612094 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.672494 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.703163 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.785272 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.816502 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.831866 4790 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.844950 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.051096 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.073969 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.224865 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.238010 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.257511 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.315291 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.370504 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.433156 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.452902 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"kube-root-ca.crt" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.468396 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.475529 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.501101 4790 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.585770 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.617399 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.708119 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.727931 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.741027 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.769245 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.769286 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 13 
20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.805172 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.834782 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.879145 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.000842 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.099335 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.208322 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.424736 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.621934 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.645399 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.651563 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.661352 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.697780 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.871812 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.902896 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.917133 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.934994 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.982010 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.006339 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.023488 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.080131 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.193322 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.312133 4790 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.327292 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.334901 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.348150 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.451156 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.481511 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.504809 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.544304 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.587841 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.596519 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.685936 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.692274 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"metrics-tls" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.794903 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.795589 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.818681 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.855829 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.867832 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.892162 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.972240 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.099848 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.190206 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.274792 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 
20:33:49.281981 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.286844 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.316402 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.326519 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.434024 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.535366 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.566003 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.586537 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.592155 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.593620 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.621137 4790 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"openshift-service-ca.crt" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.621729 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.631115 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.709268 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.749412 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.033812 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.084407 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.134450 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.141070 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.170053 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.200750 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.230473 4790 patch_prober.go:28] 
interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.230530 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.230584 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.231225 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"c72290a070b857bc3ddf32c051800ee3fe9e55397ddcdfd5d29c98edd59be0a4"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.231349 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://c72290a070b857bc3ddf32c051800ee3fe9e55397ddcdfd5d29c98edd59be0a4" gracePeriod=30 Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.523474 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.592901 4790 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.706541 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.736928 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.769855 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 13 20:33:51 crc kubenswrapper[4790]: I0313 20:33:51.004253 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 13 20:33:51 crc kubenswrapper[4790]: I0313 20:33:51.182899 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 13 20:33:51 crc kubenswrapper[4790]: I0313 20:33:51.224689 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 13 20:33:51 crc kubenswrapper[4790]: I0313 20:33:51.285841 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 13 20:33:51 crc kubenswrapper[4790]: I0313 20:33:51.340578 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 13 20:33:51 crc kubenswrapper[4790]: I0313 20:33:51.360685 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 13 20:33:51 crc kubenswrapper[4790]: I0313 20:33:51.525305 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 20:33:51 crc kubenswrapper[4790]: I0313 
20:33:51.651617 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 13 20:33:51 crc kubenswrapper[4790]: I0313 20:33:51.659524 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 13 20:33:51 crc kubenswrapper[4790]: I0313 20:33:51.880826 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 13 20:33:51 crc kubenswrapper[4790]: I0313 20:33:51.989318 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.047014 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.203485 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.217715 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.288470 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.360756 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.408506 4790 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.420760 4790 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.443539 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.468984 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.533441 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.563002 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.685119 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.926205 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.969645 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 13 20:33:53 crc kubenswrapper[4790]: I0313 20:33:53.119344 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 13 20:33:53 crc kubenswrapper[4790]: I0313 20:33:53.130169 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 13 20:33:53 crc kubenswrapper[4790]: I0313 20:33:53.281912 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 13 
20:33:53 crc kubenswrapper[4790]: I0313 20:33:53.281979 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 13 20:33:53 crc kubenswrapper[4790]: I0313 20:33:53.380725 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 13 20:33:53 crc kubenswrapper[4790]: I0313 20:33:53.385272 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 13 20:33:53 crc kubenswrapper[4790]: I0313 20:33:53.620294 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.267075 4790 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.268366 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=48.268340431 podStartE2EDuration="48.268340431s" podCreationTimestamp="2026-03-13 20:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:33:26.444772068 +0000 UTC m=+337.465887959" watchObservedRunningTime="2026-03-13 20:33:54.268340431 +0000 UTC m=+365.289456352" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.269915 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" podStartSLOduration=51.269890374 podStartE2EDuration="51.269890374s" podCreationTimestamp="2026-03-13 20:33:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 
20:33:26.435943188 +0000 UTC m=+337.457059079" watchObservedRunningTime="2026-03-13 20:33:54.269890374 +0000 UTC m=+365.291006305" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.273207 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" podStartSLOduration=51.273196173 podStartE2EDuration="51.273196173s" podCreationTimestamp="2026-03-13 20:33:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:33:26.422073972 +0000 UTC m=+337.443189873" watchObservedRunningTime="2026-03-13 20:33:54.273196173 +0000 UTC m=+365.294312104" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.275116 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-szftl","openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.275229 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq","openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.275823 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.275852 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:54 crc kubenswrapper[4790]: E0313 20:33:54.276639 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" containerName="installer" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.276670 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" containerName="installer" Mar 13 20:33:54 crc 
kubenswrapper[4790]: E0313 20:33:54.276711 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9680aeb7-b61a-46a8-baf5-44715261e4a5" containerName="oauth-openshift" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.276729 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9680aeb7-b61a-46a8-baf5-44715261e4a5" containerName="oauth-openshift" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.276949 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" containerName="installer" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.276996 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="9680aeb7-b61a-46a8-baf5-44715261e4a5" containerName="oauth-openshift" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.277888 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.281368 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.281854 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.282097 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.283238 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.283552 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.283736 4790 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.285046 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.285549 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.285730 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.285884 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.286606 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.287124 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.287454 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.292187 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.296333 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.300755 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.305629 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.345810 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=28.345791624 podStartE2EDuration="28.345791624s" podCreationTimestamp="2026-03-13 20:33:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:33:54.312083849 +0000 UTC m=+365.333199750" watchObservedRunningTime="2026-03-13 20:33:54.345791624 +0000 UTC m=+365.366907525" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.397838 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.397880 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-template-error\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.397967 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/78308f12-fefa-41d3-845f-009863f92a51-audit-dir\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.398005 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-service-ca\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.398035 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.398058 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.398150 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: 
\"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.398196 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-router-certs\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.398231 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w8hp\" (UniqueName: \"kubernetes.io/projected/78308f12-fefa-41d3-845f-009863f92a51-kube-api-access-9w8hp\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.398271 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-audit-policies\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.398302 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-session\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.398331 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-template-login\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.398428 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.398490 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.474768 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499462 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-template-login\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499526 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499554 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499576 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499596 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-template-error\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499623 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/78308f12-fefa-41d3-845f-009863f92a51-audit-dir\") pod 
\"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-service-ca\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499684 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499703 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499725 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499749 
4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-router-certs\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499773 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w8hp\" (UniqueName: \"kubernetes.io/projected/78308f12-fefa-41d3-845f-009863f92a51-kube-api-access-9w8hp\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499792 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-audit-policies\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499812 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-session\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.501188 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/78308f12-fefa-41d3-845f-009863f92a51-audit-dir\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " 
pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.501783 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-audit-policies\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.502202 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-service-ca\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.503183 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.504349 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.506332 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-router-certs\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.507016 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-template-error\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.507220 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-template-login\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.509153 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.511286 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " 
pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.512321 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.513343 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-session\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.513678 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.521649 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w8hp\" (UniqueName: \"kubernetes.io/projected/78308f12-fefa-41d3-845f-009863f92a51-kube-api-access-9w8hp\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.714669 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.858803 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.932841 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 20:33:55 crc kubenswrapper[4790]: I0313 20:33:55.158732 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq"] Mar 13 20:33:55 crc kubenswrapper[4790]: W0313 20:33:55.163795 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78308f12_fefa_41d3_845f_009863f92a51.slice/crio-b846835915cf2202ae7004f2bc21c38e8be6564452e3ce976ac09b248165446f WatchSource:0}: Error finding container b846835915cf2202ae7004f2bc21c38e8be6564452e3ce976ac09b248165446f: Status 404 returned error can't find the container with id b846835915cf2202ae7004f2bc21c38e8be6564452e3ce976ac09b248165446f Mar 13 20:33:55 crc kubenswrapper[4790]: I0313 20:33:55.251533 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 13 20:33:55 crc kubenswrapper[4790]: I0313 20:33:55.674847 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9680aeb7-b61a-46a8-baf5-44715261e4a5" path="/var/lib/kubelet/pods/9680aeb7-b61a-46a8-baf5-44715261e4a5/volumes" Mar 13 20:33:55 crc kubenswrapper[4790]: I0313 20:33:55.961841 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" event={"ID":"78308f12-fefa-41d3-845f-009863f92a51","Type":"ContainerStarted","Data":"c13c4946072e428c448655c578855828cd450f099432a68247012368d8cfd9cf"} Mar 13 20:33:55 crc kubenswrapper[4790]: I0313 
20:33:55.961900 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" event={"ID":"78308f12-fefa-41d3-845f-009863f92a51","Type":"ContainerStarted","Data":"b846835915cf2202ae7004f2bc21c38e8be6564452e3ce976ac09b248165446f"} Mar 13 20:33:55 crc kubenswrapper[4790]: I0313 20:33:55.962098 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:55 crc kubenswrapper[4790]: I0313 20:33:55.988660 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" podStartSLOduration=50.988643961 podStartE2EDuration="50.988643961s" podCreationTimestamp="2026-03-13 20:33:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:33:55.985676141 +0000 UTC m=+367.006792032" watchObservedRunningTime="2026-03-13 20:33:55.988643961 +0000 UTC m=+367.009759852" Mar 13 20:33:56 crc kubenswrapper[4790]: I0313 20:33:56.170011 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:34:00 crc kubenswrapper[4790]: I0313 20:34:00.327744 4790 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 13 20:34:00 crc kubenswrapper[4790]: I0313 20:34:00.328224 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c" gracePeriod=5 Mar 13 20:34:05 crc kubenswrapper[4790]: I0313 20:34:05.897230 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 13 20:34:05 crc kubenswrapper[4790]: I0313 20:34:05.897917 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.020320 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.020400 4790 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c" exitCode=137 Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.020446 4790 scope.go:117] "RemoveContainer" containerID="5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.020485 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.045630 4790 scope.go:117] "RemoveContainer" containerID="5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c" Mar 13 20:34:06 crc kubenswrapper[4790]: E0313 20:34:06.046226 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c\": container with ID starting with 5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c not found: ID does not exist" containerID="5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.046274 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c"} err="failed to get container status \"5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c\": rpc error: code = NotFound desc = could not find container \"5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c\": container with ID starting with 5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c not found: ID does not exist" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.053955 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054052 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 
20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054096 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054174 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054223 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054240 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054294 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054301 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054411 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054854 4790 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054892 4790 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054920 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054944 4790 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.070755 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.156252 4790 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:07 crc kubenswrapper[4790]: I0313 20:34:07.667322 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 13 20:34:07 crc kubenswrapper[4790]: I0313 20:34:07.667834 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 13 20:34:07 crc kubenswrapper[4790]: I0313 20:34:07.681447 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 13 20:34:07 crc kubenswrapper[4790]: I0313 20:34:07.681480 4790 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="a7640fba-05c9-458e-a161-3dafdd60af62" Mar 13 20:34:07 crc kubenswrapper[4790]: I0313 20:34:07.685710 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 13 20:34:07 crc kubenswrapper[4790]: I0313 20:34:07.685749 4790 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="a7640fba-05c9-458e-a161-3dafdd60af62" Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.399508 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557234-6g6zh"] Mar 13 20:34:15 crc kubenswrapper[4790]: E0313 20:34:15.401279 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
containerName="startup-monitor" Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.401360 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.401595 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.402132 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557234-6g6zh" Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.403839 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.405647 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.405914 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.415206 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557234-6g6zh"] Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.479180 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pvcc\" (UniqueName: \"kubernetes.io/projected/6b8e0ffa-a21f-4726-8185-2cff61c94b91-kube-api-access-4pvcc\") pod \"auto-csr-approver-29557234-6g6zh\" (UID: \"6b8e0ffa-a21f-4726-8185-2cff61c94b91\") " pod="openshift-infra/auto-csr-approver-29557234-6g6zh" Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.580795 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pvcc\" (UniqueName: 
\"kubernetes.io/projected/6b8e0ffa-a21f-4726-8185-2cff61c94b91-kube-api-access-4pvcc\") pod \"auto-csr-approver-29557234-6g6zh\" (UID: \"6b8e0ffa-a21f-4726-8185-2cff61c94b91\") " pod="openshift-infra/auto-csr-approver-29557234-6g6zh" Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.612456 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pvcc\" (UniqueName: \"kubernetes.io/projected/6b8e0ffa-a21f-4726-8185-2cff61c94b91-kube-api-access-4pvcc\") pod \"auto-csr-approver-29557234-6g6zh\" (UID: \"6b8e0ffa-a21f-4726-8185-2cff61c94b91\") " pod="openshift-infra/auto-csr-approver-29557234-6g6zh" Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.729872 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557234-6g6zh" Mar 13 20:34:16 crc kubenswrapper[4790]: I0313 20:34:16.130131 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557234-6g6zh"] Mar 13 20:34:17 crc kubenswrapper[4790]: I0313 20:34:17.095640 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557234-6g6zh" event={"ID":"6b8e0ffa-a21f-4726-8185-2cff61c94b91","Type":"ContainerStarted","Data":"84311e084a32f72f4c06887a54648dd1d34db74a2eab079a4647daa3afee4d12"} Mar 13 20:34:18 crc kubenswrapper[4790]: I0313 20:34:18.103369 4790 generic.go:334] "Generic (PLEG): container finished" podID="6b8e0ffa-a21f-4726-8185-2cff61c94b91" containerID="4a133641d0a543ddd92802af2ba335acfaf29e7ed5636f43383cb7790a817cba" exitCode=0 Mar 13 20:34:18 crc kubenswrapper[4790]: I0313 20:34:18.103499 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557234-6g6zh" event={"ID":"6b8e0ffa-a21f-4726-8185-2cff61c94b91","Type":"ContainerDied","Data":"4a133641d0a543ddd92802af2ba335acfaf29e7ed5636f43383cb7790a817cba"} Mar 13 20:34:19 crc kubenswrapper[4790]: I0313 20:34:19.447468 4790 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557234-6g6zh" Mar 13 20:34:19 crc kubenswrapper[4790]: I0313 20:34:19.525722 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pvcc\" (UniqueName: \"kubernetes.io/projected/6b8e0ffa-a21f-4726-8185-2cff61c94b91-kube-api-access-4pvcc\") pod \"6b8e0ffa-a21f-4726-8185-2cff61c94b91\" (UID: \"6b8e0ffa-a21f-4726-8185-2cff61c94b91\") " Mar 13 20:34:19 crc kubenswrapper[4790]: I0313 20:34:19.532593 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b8e0ffa-a21f-4726-8185-2cff61c94b91-kube-api-access-4pvcc" (OuterVolumeSpecName: "kube-api-access-4pvcc") pod "6b8e0ffa-a21f-4726-8185-2cff61c94b91" (UID: "6b8e0ffa-a21f-4726-8185-2cff61c94b91"). InnerVolumeSpecName "kube-api-access-4pvcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:34:19 crc kubenswrapper[4790]: I0313 20:34:19.627557 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pvcc\" (UniqueName: \"kubernetes.io/projected/6b8e0ffa-a21f-4726-8185-2cff61c94b91-kube-api-access-4pvcc\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:20 crc kubenswrapper[4790]: I0313 20:34:20.123889 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557234-6g6zh" event={"ID":"6b8e0ffa-a21f-4726-8185-2cff61c94b91","Type":"ContainerDied","Data":"84311e084a32f72f4c06887a54648dd1d34db74a2eab079a4647daa3afee4d12"} Mar 13 20:34:20 crc kubenswrapper[4790]: I0313 20:34:20.124207 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84311e084a32f72f4c06887a54648dd1d34db74a2eab079a4647daa3afee4d12" Mar 13 20:34:20 crc kubenswrapper[4790]: I0313 20:34:20.123978 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557234-6g6zh" Mar 13 20:34:21 crc kubenswrapper[4790]: I0313 20:34:21.133056 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 13 20:34:21 crc kubenswrapper[4790]: I0313 20:34:21.135022 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 13 20:34:21 crc kubenswrapper[4790]: I0313 20:34:21.135551 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 13 20:34:21 crc kubenswrapper[4790]: I0313 20:34:21.135591 4790 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c72290a070b857bc3ddf32c051800ee3fe9e55397ddcdfd5d29c98edd59be0a4" exitCode=137 Mar 13 20:34:21 crc kubenswrapper[4790]: I0313 20:34:21.135621 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c72290a070b857bc3ddf32c051800ee3fe9e55397ddcdfd5d29c98edd59be0a4"} Mar 13 20:34:21 crc kubenswrapper[4790]: I0313 20:34:21.135650 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1e56deb82f678c244918661e5eb7ebb160fe4919974a43590f2a4219eb47bf01"} Mar 13 20:34:21 crc kubenswrapper[4790]: I0313 20:34:21.135668 4790 scope.go:117] "RemoveContainer" containerID="341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe" Mar 13 20:34:22 crc kubenswrapper[4790]: I0313 
20:34:22.142711 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 13 20:34:22 crc kubenswrapper[4790]: I0313 20:34:22.144050 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 13 20:34:22 crc kubenswrapper[4790]: I0313 20:34:22.699771 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:34:30 crc kubenswrapper[4790]: I0313 20:34:30.229942 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:34:30 crc kubenswrapper[4790]: I0313 20:34:30.237565 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:34:31 crc kubenswrapper[4790]: I0313 20:34:31.206644 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:34:31 crc kubenswrapper[4790]: I0313 20:34:31.846738 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-df8gv"] Mar 13 20:34:31 crc kubenswrapper[4790]: I0313 20:34:31.846990 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-df8gv" podUID="da03af74-8c59-4ccf-aff8-03dc6303e322" containerName="registry-server" containerID="cri-o://934478e1636def539b4b75131eeeef3a5a527bcd02efeeb3dc4dc663186f9f4a" gracePeriod=2 Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.050327 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5tr4n"] Mar 
13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.050974 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5tr4n" podUID="446f0f4c-a97c-47d0-929d-0b99e07c8186" containerName="registry-server" containerID="cri-o://283af7b78e5df22c61725b66908c69af3f6b7ed01b3dc5cf3a313cb16df58c38" gracePeriod=2 Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.211582 4790 generic.go:334] "Generic (PLEG): container finished" podID="da03af74-8c59-4ccf-aff8-03dc6303e322" containerID="934478e1636def539b4b75131eeeef3a5a527bcd02efeeb3dc4dc663186f9f4a" exitCode=0 Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.211657 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df8gv" event={"ID":"da03af74-8c59-4ccf-aff8-03dc6303e322","Type":"ContainerDied","Data":"934478e1636def539b4b75131eeeef3a5a527bcd02efeeb3dc4dc663186f9f4a"} Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.213804 4790 generic.go:334] "Generic (PLEG): container finished" podID="446f0f4c-a97c-47d0-929d-0b99e07c8186" containerID="283af7b78e5df22c61725b66908c69af3f6b7ed01b3dc5cf3a313cb16df58c38" exitCode=0 Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.214469 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tr4n" event={"ID":"446f0f4c-a97c-47d0-929d-0b99e07c8186","Type":"ContainerDied","Data":"283af7b78e5df22c61725b66908c69af3f6b7ed01b3dc5cf3a313cb16df58c38"} Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.270013 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.388173 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-catalog-content\") pod \"da03af74-8c59-4ccf-aff8-03dc6303e322\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.388233 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-utilities\") pod \"da03af74-8c59-4ccf-aff8-03dc6303e322\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.388347 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwk57\" (UniqueName: \"kubernetes.io/projected/da03af74-8c59-4ccf-aff8-03dc6303e322-kube-api-access-fwk57\") pod \"da03af74-8c59-4ccf-aff8-03dc6303e322\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.389034 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-utilities" (OuterVolumeSpecName: "utilities") pod "da03af74-8c59-4ccf-aff8-03dc6303e322" (UID: "da03af74-8c59-4ccf-aff8-03dc6303e322"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.389583 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.393891 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da03af74-8c59-4ccf-aff8-03dc6303e322-kube-api-access-fwk57" (OuterVolumeSpecName: "kube-api-access-fwk57") pod "da03af74-8c59-4ccf-aff8-03dc6303e322" (UID: "da03af74-8c59-4ccf-aff8-03dc6303e322"). InnerVolumeSpecName "kube-api-access-fwk57". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.394931 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.439041 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da03af74-8c59-4ccf-aff8-03dc6303e322" (UID: "da03af74-8c59-4ccf-aff8-03dc6303e322"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.490278 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dmtw\" (UniqueName: \"kubernetes.io/projected/446f0f4c-a97c-47d0-929d-0b99e07c8186-kube-api-access-4dmtw\") pod \"446f0f4c-a97c-47d0-929d-0b99e07c8186\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.490351 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-catalog-content\") pod \"446f0f4c-a97c-47d0-929d-0b99e07c8186\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.490514 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-utilities\") pod \"446f0f4c-a97c-47d0-929d-0b99e07c8186\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.490756 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwk57\" (UniqueName: \"kubernetes.io/projected/da03af74-8c59-4ccf-aff8-03dc6303e322-kube-api-access-fwk57\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.490798 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.491478 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-utilities" (OuterVolumeSpecName: "utilities") pod "446f0f4c-a97c-47d0-929d-0b99e07c8186" (UID: 
"446f0f4c-a97c-47d0-929d-0b99e07c8186"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.494095 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/446f0f4c-a97c-47d0-929d-0b99e07c8186-kube-api-access-4dmtw" (OuterVolumeSpecName: "kube-api-access-4dmtw") pod "446f0f4c-a97c-47d0-929d-0b99e07c8186" (UID: "446f0f4c-a97c-47d0-929d-0b99e07c8186"). InnerVolumeSpecName "kube-api-access-4dmtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.539472 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "446f0f4c-a97c-47d0-929d-0b99e07c8186" (UID: "446f0f4c-a97c-47d0-929d-0b99e07c8186"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.592067 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.592097 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dmtw\" (UniqueName: \"kubernetes.io/projected/446f0f4c-a97c-47d0-929d-0b99e07c8186-kube-api-access-4dmtw\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.592109 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.220928 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-df8gv" event={"ID":"da03af74-8c59-4ccf-aff8-03dc6303e322","Type":"ContainerDied","Data":"9761002ea58d403e6092f58c313ddf3e3892646900d306f6d06f23ff553f5760"} Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.221031 4790 scope.go:117] "RemoveContainer" containerID="934478e1636def539b4b75131eeeef3a5a527bcd02efeeb3dc4dc663186f9f4a" Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.220959 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.225094 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tr4n" event={"ID":"446f0f4c-a97c-47d0-929d-0b99e07c8186","Type":"ContainerDied","Data":"1f3bbc4d7d37e2d400e1366f116e79095d38ddf23a471dd30cc3d7e41c04740d"} Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.225216 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.240678 4790 scope.go:117] "RemoveContainer" containerID="69f53c59d1e74a1fc57678e4a1a5f136fbff7feef571b3a55782dea49bf4ca77" Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.251787 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-df8gv"] Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.255920 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-df8gv"] Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.265851 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5tr4n"] Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.282926 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5tr4n"] Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.289360 4790 scope.go:117] "RemoveContainer" containerID="18d45729b57b0625b6ac059bc91aedd72d39045472cf08d5152f47c470f71f43" Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.313646 4790 scope.go:117] "RemoveContainer" containerID="283af7b78e5df22c61725b66908c69af3f6b7ed01b3dc5cf3a313cb16df58c38" Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.332554 4790 scope.go:117] "RemoveContainer" containerID="58a3c18d60db23fb517df83cf8f798fb4a929be2cac998373fad7a7e27e0143b" Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.364446 4790 scope.go:117] "RemoveContainer" containerID="33326be198fd78688d8c0e82df3982727cfbc7e94ef4969d1503af495b1859ed" Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.665721 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="446f0f4c-a97c-47d0-929d-0b99e07c8186" path="/var/lib/kubelet/pods/446f0f4c-a97c-47d0-929d-0b99e07c8186/volumes" Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.666742 4790 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da03af74-8c59-4ccf-aff8-03dc6303e322" path="/var/lib/kubelet/pods/da03af74-8c59-4ccf-aff8-03dc6303e322/volumes" Mar 13 20:34:37 crc kubenswrapper[4790]: I0313 20:34:37.847197 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fxjp7"] Mar 13 20:34:37 crc kubenswrapper[4790]: I0313 20:34:37.847930 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fxjp7" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerName="registry-server" containerID="cri-o://2991bdf1214b34771f3920c4e5c74e4a6f7ce03bf40eb290c472871cdaa464ce" gracePeriod=2 Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.252567 4790 generic.go:334] "Generic (PLEG): container finished" podID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerID="2991bdf1214b34771f3920c4e5c74e4a6f7ce03bf40eb290c472871cdaa464ce" exitCode=0 Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.252624 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxjp7" event={"ID":"4aa0c26b-aef8-49e9-9904-da9e8d029c9d","Type":"ContainerDied","Data":"2991bdf1214b34771f3920c4e5c74e4a6f7ce03bf40eb290c472871cdaa464ce"} Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.313656 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.461048 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-utilities\") pod \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.461132 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhj8f\" (UniqueName: \"kubernetes.io/projected/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-kube-api-access-vhj8f\") pod \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.461204 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-catalog-content\") pod \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.462202 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-utilities" (OuterVolumeSpecName: "utilities") pod "4aa0c26b-aef8-49e9-9904-da9e8d029c9d" (UID: "4aa0c26b-aef8-49e9-9904-da9e8d029c9d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.465804 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-kube-api-access-vhj8f" (OuterVolumeSpecName: "kube-api-access-vhj8f") pod "4aa0c26b-aef8-49e9-9904-da9e8d029c9d" (UID: "4aa0c26b-aef8-49e9-9904-da9e8d029c9d"). InnerVolumeSpecName "kube-api-access-vhj8f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.562993 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.563107 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhj8f\" (UniqueName: \"kubernetes.io/projected/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-kube-api-access-vhj8f\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.591544 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4aa0c26b-aef8-49e9-9904-da9e8d029c9d" (UID: "4aa0c26b-aef8-49e9-9904-da9e8d029c9d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.668125 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:39 crc kubenswrapper[4790]: I0313 20:34:39.260363 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxjp7" event={"ID":"4aa0c26b-aef8-49e9-9904-da9e8d029c9d","Type":"ContainerDied","Data":"050e353cf4b2b386c77190a755e24b1d103134a927f84598ad4dbf53d6d3a4fa"} Mar 13 20:34:39 crc kubenswrapper[4790]: I0313 20:34:39.260466 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:34:39 crc kubenswrapper[4790]: I0313 20:34:39.260723 4790 scope.go:117] "RemoveContainer" containerID="2991bdf1214b34771f3920c4e5c74e4a6f7ce03bf40eb290c472871cdaa464ce" Mar 13 20:34:39 crc kubenswrapper[4790]: I0313 20:34:39.274881 4790 scope.go:117] "RemoveContainer" containerID="324ef417e590b70303b2a28886536562959e53b4d52847bd1309db91eab7a573" Mar 13 20:34:39 crc kubenswrapper[4790]: I0313 20:34:39.290771 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fxjp7"] Mar 13 20:34:39 crc kubenswrapper[4790]: I0313 20:34:39.295041 4790 scope.go:117] "RemoveContainer" containerID="f276b163ccc0d21403b49d02b3c506a94213d0bcc943d5fcede3603bc020ebfc" Mar 13 20:34:39 crc kubenswrapper[4790]: I0313 20:34:39.296233 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fxjp7"] Mar 13 20:34:39 crc kubenswrapper[4790]: I0313 20:34:39.665946 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" path="/var/lib/kubelet/pods/4aa0c26b-aef8-49e9-9904-da9e8d029c9d/volumes" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765041 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xf7s4"] Mar 13 20:34:40 crc kubenswrapper[4790]: E0313 20:34:40.765308 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da03af74-8c59-4ccf-aff8-03dc6303e322" containerName="extract-utilities" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765327 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="da03af74-8c59-4ccf-aff8-03dc6303e322" containerName="extract-utilities" Mar 13 20:34:40 crc kubenswrapper[4790]: E0313 20:34:40.765345 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="446f0f4c-a97c-47d0-929d-0b99e07c8186" containerName="extract-content" Mar 13 
20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765353 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="446f0f4c-a97c-47d0-929d-0b99e07c8186" containerName="extract-content" Mar 13 20:34:40 crc kubenswrapper[4790]: E0313 20:34:40.765366 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da03af74-8c59-4ccf-aff8-03dc6303e322" containerName="extract-content" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765391 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="da03af74-8c59-4ccf-aff8-03dc6303e322" containerName="extract-content" Mar 13 20:34:40 crc kubenswrapper[4790]: E0313 20:34:40.765406 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerName="registry-server" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765413 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerName="registry-server" Mar 13 20:34:40 crc kubenswrapper[4790]: E0313 20:34:40.765427 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b8e0ffa-a21f-4726-8185-2cff61c94b91" containerName="oc" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765435 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8e0ffa-a21f-4726-8185-2cff61c94b91" containerName="oc" Mar 13 20:34:40 crc kubenswrapper[4790]: E0313 20:34:40.765444 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="446f0f4c-a97c-47d0-929d-0b99e07c8186" containerName="extract-utilities" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765452 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="446f0f4c-a97c-47d0-929d-0b99e07c8186" containerName="extract-utilities" Mar 13 20:34:40 crc kubenswrapper[4790]: E0313 20:34:40.765459 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="446f0f4c-a97c-47d0-929d-0b99e07c8186" containerName="registry-server" Mar 13 20:34:40 crc kubenswrapper[4790]: 
I0313 20:34:40.765466 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="446f0f4c-a97c-47d0-929d-0b99e07c8186" containerName="registry-server" Mar 13 20:34:40 crc kubenswrapper[4790]: E0313 20:34:40.765477 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerName="extract-utilities" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765484 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerName="extract-utilities" Mar 13 20:34:40 crc kubenswrapper[4790]: E0313 20:34:40.765494 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da03af74-8c59-4ccf-aff8-03dc6303e322" containerName="registry-server" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765502 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="da03af74-8c59-4ccf-aff8-03dc6303e322" containerName="registry-server" Mar 13 20:34:40 crc kubenswrapper[4790]: E0313 20:34:40.765512 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerName="extract-content" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765519 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerName="extract-content" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765648 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerName="registry-server" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765664 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b8e0ffa-a21f-4726-8185-2cff61c94b91" containerName="oc" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765682 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="da03af74-8c59-4ccf-aff8-03dc6303e322" containerName="registry-server" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 
20:34:40.765697 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="446f0f4c-a97c-47d0-929d-0b99e07c8186" containerName="registry-server" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.766128 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.791648 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xf7s4"] Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.893632 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-registry-certificates\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.893898 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-trusted-ca\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.894022 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.894151 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.894258 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-bound-sa-token\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.894353 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsl42\" (UniqueName: \"kubernetes.io/projected/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-kube-api-access-tsl42\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.894461 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.894543 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-registry-tls\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.911793 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.995421 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.995497 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-bound-sa-token\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.995528 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsl42\" (UniqueName: \"kubernetes.io/projected/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-kube-api-access-tsl42\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.995553 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.995579 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-registry-tls\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.995603 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-registry-certificates\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.995627 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-trusted-ca\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.996490 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.997126 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-registry-certificates\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.997140 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-trusted-ca\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:41 crc kubenswrapper[4790]: I0313 20:34:41.000438 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-registry-tls\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:41 crc kubenswrapper[4790]: I0313 20:34:41.010326 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-bound-sa-token\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:41 crc kubenswrapper[4790]: I0313 20:34:41.016179 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:41 crc kubenswrapper[4790]: I0313 20:34:41.019180 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsl42\" (UniqueName: \"kubernetes.io/projected/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-kube-api-access-tsl42\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:41 crc kubenswrapper[4790]: I0313 20:34:41.081304 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:41 crc kubenswrapper[4790]: I0313 20:34:41.471578 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xf7s4"] Mar 13 20:34:42 crc kubenswrapper[4790]: I0313 20:34:42.279885 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" event={"ID":"180ef86a-6ccb-4c72-9722-be08fb3c8bc7","Type":"ContainerStarted","Data":"54695274d9f3009dd8466f0dee6264bd5c891fc94c0257d695ff542d0cf8fe96"} Mar 13 20:34:42 crc kubenswrapper[4790]: I0313 20:34:42.280219 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:42 crc kubenswrapper[4790]: I0313 20:34:42.280231 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" event={"ID":"180ef86a-6ccb-4c72-9722-be08fb3c8bc7","Type":"ContainerStarted","Data":"13eae60b8677d2ee63a2140ef49608d4470d49da2be0ca0f3a2066fd422617ec"} Mar 13 20:34:44 crc kubenswrapper[4790]: I0313 20:34:44.015517 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:34:44 crc kubenswrapper[4790]: I0313 
20:34:44.015573 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:35:01 crc kubenswrapper[4790]: I0313 20:35:01.086675 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:35:01 crc kubenswrapper[4790]: I0313 20:35:01.127007 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" podStartSLOduration=21.12697509 podStartE2EDuration="21.12697509s" podCreationTimestamp="2026-03-13 20:34:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:34:42.302464489 +0000 UTC m=+413.323580380" watchObservedRunningTime="2026-03-13 20:35:01.12697509 +0000 UTC m=+432.148091021" Mar 13 20:35:01 crc kubenswrapper[4790]: I0313 20:35:01.152358 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqdfm"] Mar 13 20:35:14 crc kubenswrapper[4790]: I0313 20:35:14.015717 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:35:14 crc kubenswrapper[4790]: I0313 20:35:14.016544 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.043241 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-txx64"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.044081 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-txx64" podUID="7080e6b3-5934-4c2c-9361-23d20b5a495e" containerName="registry-server" containerID="cri-o://e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723" gracePeriod=30 Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.081870 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-672cv"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.082148 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-672cv" podUID="dbee8a79-e625-49ef-8fcb-944341ae6e37" containerName="registry-server" containerID="cri-o://cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88" gracePeriod=30 Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.088293 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jnbzb"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.089439 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" podUID="53c38463-b7c5-42c8-a447-7d0e7f190aa9" containerName="marketplace-operator" containerID="cri-o://ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010" gracePeriod=30 Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.090997 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n548b"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.091926 4790 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.095168 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bq4pj"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.095547 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bq4pj" podUID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" containerName="registry-server" containerID="cri-o://674b4b30c55e5b326d6218ed4dd61e880c35ab5aace228b74177c0e6379905ee" gracePeriod=30 Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.099222 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hnd2l"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.099493 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hnd2l" podUID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerName="registry-server" containerID="cri-o://85bd59b87e1f4b58047275cf65a277b9c79fac88d40c0b516ac9852cc7b0c0af" gracePeriod=30 Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.107086 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n548b"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.238592 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6jzp\" (UniqueName: \"kubernetes.io/projected/97fe66e8-7366-4c61-b1db-4d98459834da-kube-api-access-z6jzp\") pod \"marketplace-operator-79b997595-n548b\" (UID: \"97fe66e8-7366-4c61-b1db-4d98459834da\") " pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.238951 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/97fe66e8-7366-4c61-b1db-4d98459834da-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n548b\" (UID: \"97fe66e8-7366-4c61-b1db-4d98459834da\") " pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.239035 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97fe66e8-7366-4c61-b1db-4d98459834da-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n548b\" (UID: \"97fe66e8-7366-4c61-b1db-4d98459834da\") " pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.340102 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97fe66e8-7366-4c61-b1db-4d98459834da-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n548b\" (UID: \"97fe66e8-7366-4c61-b1db-4d98459834da\") " pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.340215 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6jzp\" (UniqueName: \"kubernetes.io/projected/97fe66e8-7366-4c61-b1db-4d98459834da-kube-api-access-z6jzp\") pod \"marketplace-operator-79b997595-n548b\" (UID: \"97fe66e8-7366-4c61-b1db-4d98459834da\") " pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.340257 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/97fe66e8-7366-4c61-b1db-4d98459834da-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n548b\" (UID: 
\"97fe66e8-7366-4c61-b1db-4d98459834da\") " pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.346201 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/97fe66e8-7366-4c61-b1db-4d98459834da-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n548b\" (UID: \"97fe66e8-7366-4c61-b1db-4d98459834da\") " pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.354827 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97fe66e8-7366-4c61-b1db-4d98459834da-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n548b\" (UID: \"97fe66e8-7366-4c61-b1db-4d98459834da\") " pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.375113 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6jzp\" (UniqueName: \"kubernetes.io/projected/97fe66e8-7366-4c61-b1db-4d98459834da-kube-api-access-z6jzp\") pod \"marketplace-operator-79b997595-n548b\" (UID: \"97fe66e8-7366-4c61-b1db-4d98459834da\") " pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.414818 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.530160 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.532087 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.544052 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-672cv" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.557453 4790 generic.go:334] "Generic (PLEG): container finished" podID="53c38463-b7c5-42c8-a447-7d0e7f190aa9" containerID="ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010" exitCode=0 Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.557531 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" event={"ID":"53c38463-b7c5-42c8-a447-7d0e7f190aa9","Type":"ContainerDied","Data":"ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010"} Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.557568 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" event={"ID":"53c38463-b7c5-42c8-a447-7d0e7f190aa9","Type":"ContainerDied","Data":"bad985ac5d6a6fd6a14b185a97704f5e25df7aba222388f921733e6977b5b5eb"} Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.557590 4790 scope.go:117] "RemoveContainer" containerID="ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.557623 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.570989 4790 generic.go:334] "Generic (PLEG): container finished" podID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" containerID="674b4b30c55e5b326d6218ed4dd61e880c35ab5aace228b74177c0e6379905ee" exitCode=0 Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.571075 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bq4pj" event={"ID":"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c","Type":"ContainerDied","Data":"674b4b30c55e5b326d6218ed4dd61e880c35ab5aace228b74177c0e6379905ee"} Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.573962 4790 generic.go:334] "Generic (PLEG): container finished" podID="7080e6b3-5934-4c2c-9361-23d20b5a495e" containerID="e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723" exitCode=0 Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.574023 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txx64" event={"ID":"7080e6b3-5934-4c2c-9361-23d20b5a495e","Type":"ContainerDied","Data":"e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723"} Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.574057 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txx64" event={"ID":"7080e6b3-5934-4c2c-9361-23d20b5a495e","Type":"ContainerDied","Data":"5bff08277bee799461658bd86530c13fa744a49d2daab25cbda9f9c23ac16aa2"} Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.574120 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.578548 4790 generic.go:334] "Generic (PLEG): container finished" podID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerID="85bd59b87e1f4b58047275cf65a277b9c79fac88d40c0b516ac9852cc7b0c0af" exitCode=0 Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.578633 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hnd2l" event={"ID":"36d32cb2-55c9-48cc-9376-66231ae66f8a","Type":"ContainerDied","Data":"85bd59b87e1f4b58047275cf65a277b9c79fac88d40c0b516ac9852cc7b0c0af"} Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.580625 4790 generic.go:334] "Generic (PLEG): container finished" podID="dbee8a79-e625-49ef-8fcb-944341ae6e37" containerID="cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88" exitCode=0 Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.580656 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-672cv" event={"ID":"dbee8a79-e625-49ef-8fcb-944341ae6e37","Type":"ContainerDied","Data":"cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88"} Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.580676 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-672cv" event={"ID":"dbee8a79-e625-49ef-8fcb-944341ae6e37","Type":"ContainerDied","Data":"073e407a9eaa46913e8a833719c1712b0b191e45db2255328c3b799329f32f02"} Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.580740 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-672cv" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.586601 4790 scope.go:117] "RemoveContainer" containerID="ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010" Mar 13 20:35:24 crc kubenswrapper[4790]: E0313 20:35:24.587333 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010\": container with ID starting with ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010 not found: ID does not exist" containerID="ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.587387 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010"} err="failed to get container status \"ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010\": rpc error: code = NotFound desc = could not find container \"ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010\": container with ID starting with ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010 not found: ID does not exist" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.587415 4790 scope.go:117] "RemoveContainer" containerID="e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.613316 4790 scope.go:117] "RemoveContainer" containerID="4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.615021 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.619345 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.642359 4790 scope.go:117] "RemoveContainer" containerID="37f1fa4e4095d22491db4f81d70f3406da6fafc539527a21b3dba5846164e566" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.642812 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-trusted-ca\") pod \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.642888 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hskct\" (UniqueName: \"kubernetes.io/projected/7080e6b3-5934-4c2c-9361-23d20b5a495e-kube-api-access-hskct\") pod \"7080e6b3-5934-4c2c-9361-23d20b5a495e\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.643799 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "53c38463-b7c5-42c8-a447-7d0e7f190aa9" (UID: "53c38463-b7c5-42c8-a447-7d0e7f190aa9"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.644101 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-catalog-content\") pod \"7080e6b3-5934-4c2c-9361-23d20b5a495e\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.644164 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-utilities\") pod \"dbee8a79-e625-49ef-8fcb-944341ae6e37\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.644206 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-operator-metrics\") pod \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.644244 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-catalog-content\") pod \"dbee8a79-e625-49ef-8fcb-944341ae6e37\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.644267 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhkbp\" (UniqueName: \"kubernetes.io/projected/dbee8a79-e625-49ef-8fcb-944341ae6e37-kube-api-access-zhkbp\") pod \"dbee8a79-e625-49ef-8fcb-944341ae6e37\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.644303 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-utilities\") pod \"7080e6b3-5934-4c2c-9361-23d20b5a495e\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.644329 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfldl\" (UniqueName: \"kubernetes.io/projected/53c38463-b7c5-42c8-a447-7d0e7f190aa9-kube-api-access-zfldl\") pod \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.644701 4790 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.644983 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-utilities" (OuterVolumeSpecName: "utilities") pod "dbee8a79-e625-49ef-8fcb-944341ae6e37" (UID: "dbee8a79-e625-49ef-8fcb-944341ae6e37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.648573 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbee8a79-e625-49ef-8fcb-944341ae6e37-kube-api-access-zhkbp" (OuterVolumeSpecName: "kube-api-access-zhkbp") pod "dbee8a79-e625-49ef-8fcb-944341ae6e37" (UID: "dbee8a79-e625-49ef-8fcb-944341ae6e37"). InnerVolumeSpecName "kube-api-access-zhkbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.648927 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-utilities" (OuterVolumeSpecName: "utilities") pod "7080e6b3-5934-4c2c-9361-23d20b5a495e" (UID: "7080e6b3-5934-4c2c-9361-23d20b5a495e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.655141 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "53c38463-b7c5-42c8-a447-7d0e7f190aa9" (UID: "53c38463-b7c5-42c8-a447-7d0e7f190aa9"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.655525 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7080e6b3-5934-4c2c-9361-23d20b5a495e-kube-api-access-hskct" (OuterVolumeSpecName: "kube-api-access-hskct") pod "7080e6b3-5934-4c2c-9361-23d20b5a495e" (UID: "7080e6b3-5934-4c2c-9361-23d20b5a495e"). InnerVolumeSpecName "kube-api-access-hskct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.655614 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53c38463-b7c5-42c8-a447-7d0e7f190aa9-kube-api-access-zfldl" (OuterVolumeSpecName: "kube-api-access-zfldl") pod "53c38463-b7c5-42c8-a447-7d0e7f190aa9" (UID: "53c38463-b7c5-42c8-a447-7d0e7f190aa9"). InnerVolumeSpecName "kube-api-access-zfldl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.675888 4790 scope.go:117] "RemoveContainer" containerID="e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723" Mar 13 20:35:24 crc kubenswrapper[4790]: E0313 20:35:24.676390 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723\": container with ID starting with e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723 not found: ID does not exist" containerID="e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.676429 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723"} err="failed to get container status \"e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723\": rpc error: code = NotFound desc = could not find container \"e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723\": container with ID starting with e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723 not found: ID does not exist" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.676457 4790 scope.go:117] "RemoveContainer" containerID="4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73" Mar 13 20:35:24 crc kubenswrapper[4790]: E0313 20:35:24.676770 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73\": container with ID starting with 4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73 not found: ID does not exist" containerID="4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.676801 
4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73"} err="failed to get container status \"4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73\": rpc error: code = NotFound desc = could not find container \"4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73\": container with ID starting with 4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73 not found: ID does not exist" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.676819 4790 scope.go:117] "RemoveContainer" containerID="37f1fa4e4095d22491db4f81d70f3406da6fafc539527a21b3dba5846164e566" Mar 13 20:35:24 crc kubenswrapper[4790]: E0313 20:35:24.677098 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37f1fa4e4095d22491db4f81d70f3406da6fafc539527a21b3dba5846164e566\": container with ID starting with 37f1fa4e4095d22491db4f81d70f3406da6fafc539527a21b3dba5846164e566 not found: ID does not exist" containerID="37f1fa4e4095d22491db4f81d70f3406da6fafc539527a21b3dba5846164e566" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.677128 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37f1fa4e4095d22491db4f81d70f3406da6fafc539527a21b3dba5846164e566"} err="failed to get container status \"37f1fa4e4095d22491db4f81d70f3406da6fafc539527a21b3dba5846164e566\": rpc error: code = NotFound desc = could not find container \"37f1fa4e4095d22491db4f81d70f3406da6fafc539527a21b3dba5846164e566\": container with ID starting with 37f1fa4e4095d22491db4f81d70f3406da6fafc539527a21b3dba5846164e566 not found: ID does not exist" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.677149 4790 scope.go:117] "RemoveContainer" containerID="cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 
20:35:24.698182 4790 scope.go:117] "RemoveContainer" containerID="3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.710718 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dbee8a79-e625-49ef-8fcb-944341ae6e37" (UID: "dbee8a79-e625-49ef-8fcb-944341ae6e37"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.714341 4790 scope.go:117] "RemoveContainer" containerID="721e8d71cffd6022d21d74d5c95c4b0f3755ad66a2257e8a4590088e187a7975" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.714926 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7080e6b3-5934-4c2c-9361-23d20b5a495e" (UID: "7080e6b3-5934-4c2c-9361-23d20b5a495e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.728894 4790 scope.go:117] "RemoveContainer" containerID="cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88" Mar 13 20:35:24 crc kubenswrapper[4790]: E0313 20:35:24.729276 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88\": container with ID starting with cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88 not found: ID does not exist" containerID="cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.729308 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88"} err="failed to get container status \"cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88\": rpc error: code = NotFound desc = could not find container \"cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88\": container with ID starting with cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88 not found: ID does not exist" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.729327 4790 scope.go:117] "RemoveContainer" containerID="3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91" Mar 13 20:35:24 crc kubenswrapper[4790]: E0313 20:35:24.729666 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91\": container with ID starting with 3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91 not found: ID does not exist" containerID="3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.729687 
4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91"} err="failed to get container status \"3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91\": rpc error: code = NotFound desc = could not find container \"3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91\": container with ID starting with 3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91 not found: ID does not exist" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.729700 4790 scope.go:117] "RemoveContainer" containerID="721e8d71cffd6022d21d74d5c95c4b0f3755ad66a2257e8a4590088e187a7975" Mar 13 20:35:24 crc kubenswrapper[4790]: E0313 20:35:24.730056 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"721e8d71cffd6022d21d74d5c95c4b0f3755ad66a2257e8a4590088e187a7975\": container with ID starting with 721e8d71cffd6022d21d74d5c95c4b0f3755ad66a2257e8a4590088e187a7975 not found: ID does not exist" containerID="721e8d71cffd6022d21d74d5c95c4b0f3755ad66a2257e8a4590088e187a7975" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.730117 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"721e8d71cffd6022d21d74d5c95c4b0f3755ad66a2257e8a4590088e187a7975"} err="failed to get container status \"721e8d71cffd6022d21d74d5c95c4b0f3755ad66a2257e8a4590088e187a7975\": rpc error: code = NotFound desc = could not find container \"721e8d71cffd6022d21d74d5c95c4b0f3755ad66a2257e8a4590088e187a7975\": container with ID starting with 721e8d71cffd6022d21d74d5c95c4b0f3755ad66a2257e8a4590088e187a7975 not found: ID does not exist" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.745641 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vdxw\" (UniqueName: 
\"kubernetes.io/projected/36d32cb2-55c9-48cc-9376-66231ae66f8a-kube-api-access-7vdxw\") pod \"36d32cb2-55c9-48cc-9376-66231ae66f8a\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.745703 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-catalog-content\") pod \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.745746 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-utilities\") pod \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.745805 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-utilities\") pod \"36d32cb2-55c9-48cc-9376-66231ae66f8a\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.745859 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-catalog-content\") pod \"36d32cb2-55c9-48cc-9376-66231ae66f8a\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.745885 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbqp8\" (UniqueName: \"kubernetes.io/projected/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-kube-api-access-xbqp8\") pod \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 
20:35:24.746328 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hskct\" (UniqueName: \"kubernetes.io/projected/7080e6b3-5934-4c2c-9361-23d20b5a495e-kube-api-access-hskct\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.746349 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.746360 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.746389 4790 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.746401 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.746412 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhkbp\" (UniqueName: \"kubernetes.io/projected/dbee8a79-e625-49ef-8fcb-944341ae6e37-kube-api-access-zhkbp\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.746422 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.746431 4790 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-zfldl\" (UniqueName: \"kubernetes.io/projected/53c38463-b7c5-42c8-a447-7d0e7f190aa9-kube-api-access-zfldl\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.748438 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-utilities" (OuterVolumeSpecName: "utilities") pod "e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" (UID: "e17d5bd1-f368-47a4-80cb-3bd3eb4b822c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.748860 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-utilities" (OuterVolumeSpecName: "utilities") pod "36d32cb2-55c9-48cc-9376-66231ae66f8a" (UID: "36d32cb2-55c9-48cc-9376-66231ae66f8a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.749781 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d32cb2-55c9-48cc-9376-66231ae66f8a-kube-api-access-7vdxw" (OuterVolumeSpecName: "kube-api-access-7vdxw") pod "36d32cb2-55c9-48cc-9376-66231ae66f8a" (UID: "36d32cb2-55c9-48cc-9376-66231ae66f8a"). InnerVolumeSpecName "kube-api-access-7vdxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.751090 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-kube-api-access-xbqp8" (OuterVolumeSpecName: "kube-api-access-xbqp8") pod "e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" (UID: "e17d5bd1-f368-47a4-80cb-3bd3eb4b822c"). InnerVolumeSpecName "kube-api-access-xbqp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.772642 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" (UID: "e17d5bd1-f368-47a4-80cb-3bd3eb4b822c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.847269 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbqp8\" (UniqueName: \"kubernetes.io/projected/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-kube-api-access-xbqp8\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.847297 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vdxw\" (UniqueName: \"kubernetes.io/projected/36d32cb2-55c9-48cc-9376-66231ae66f8a-kube-api-access-7vdxw\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.847307 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.847317 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.847327 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.882935 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-n548b"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.887733 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36d32cb2-55c9-48cc-9376-66231ae66f8a" (UID: "36d32cb2-55c9-48cc-9376-66231ae66f8a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.897473 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jnbzb"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.897533 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jnbzb"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.915604 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-672cv"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.919281 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-672cv"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.933586 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-txx64"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.937739 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-txx64"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.948642 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.592349 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-bq4pj" event={"ID":"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c","Type":"ContainerDied","Data":"48ce1cd0515d2f72905d7c3b45c89c2baec4ecf2f36741a13ea570b7bf830ee2"} Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.592730 4790 scope.go:117] "RemoveContainer" containerID="674b4b30c55e5b326d6218ed4dd61e880c35ab5aace228b74177c0e6379905ee" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.592437 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.596524 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n548b" event={"ID":"97fe66e8-7366-4c61-b1db-4d98459834da","Type":"ContainerStarted","Data":"4411b6997bd2d48e537ea3015a3a92b116e5bb55ef646ff60a8f03599e1dd656"} Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.596644 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.596718 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n548b" event={"ID":"97fe66e8-7366-4c61-b1db-4d98459834da","Type":"ContainerStarted","Data":"241fd60f722d811433f2a0a1db304b3b11e77ad867ee343ee1aec7da09239dcf"} Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.601126 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hnd2l" event={"ID":"36d32cb2-55c9-48cc-9376-66231ae66f8a","Type":"ContainerDied","Data":"5433561752fd3b8f83751ddd33926ccfe479acc64fdf830adcad528290d813de"} Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.601452 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.602208 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.615425 4790 scope.go:117] "RemoveContainer" containerID="fb06926f483f81716d03c8b9371fdea2581fe7126069171b7e5648810c33b206" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.624226 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-n548b" podStartSLOduration=1.624179238 podStartE2EDuration="1.624179238s" podCreationTimestamp="2026-03-13 20:35:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:35:25.619619161 +0000 UTC m=+456.640735052" watchObservedRunningTime="2026-03-13 20:35:25.624179238 +0000 UTC m=+456.645295129" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.645678 4790 scope.go:117] "RemoveContainer" containerID="81d571ea6f444235cc217ca2f76bd3ade803e952dcea7fa197b363c62b207fc9" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.685284 4790 scope.go:117] "RemoveContainer" containerID="85bd59b87e1f4b58047275cf65a277b9c79fac88d40c0b516ac9852cc7b0c0af" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.686231 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53c38463-b7c5-42c8-a447-7d0e7f190aa9" path="/var/lib/kubelet/pods/53c38463-b7c5-42c8-a447-7d0e7f190aa9/volumes" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.687731 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7080e6b3-5934-4c2c-9361-23d20b5a495e" path="/var/lib/kubelet/pods/7080e6b3-5934-4c2c-9361-23d20b5a495e/volumes" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.698909 4790 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="dbee8a79-e625-49ef-8fcb-944341ae6e37" path="/var/lib/kubelet/pods/dbee8a79-e625-49ef-8fcb-944341ae6e37/volumes" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.699780 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bq4pj"] Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.699831 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bq4pj"] Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.699854 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hnd2l"] Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.699871 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hnd2l"] Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.705428 4790 scope.go:117] "RemoveContainer" containerID="73d3471f670ba4404f090445863d367e893e2298e86dde9160ee12a7e04a36a6" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.721800 4790 scope.go:117] "RemoveContainer" containerID="4ccfbd25425ce912c32c0f73aa49b376929e5a036b5718d87d565520eab1f4ab" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.044799 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q5brt"] Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.045754 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbee8a79-e625-49ef-8fcb-944341ae6e37" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.045790 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbee8a79-e625-49ef-8fcb-944341ae6e37" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.045804 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerName="extract-content" Mar 13 20:35:26 crc 
kubenswrapper[4790]: I0313 20:35:26.045813 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerName="extract-content" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.045822 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerName="extract-utilities" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.045832 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerName="extract-utilities" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.045844 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbee8a79-e625-49ef-8fcb-944341ae6e37" containerName="extract-content" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.045852 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbee8a79-e625-49ef-8fcb-944341ae6e37" containerName="extract-content" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.045864 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7080e6b3-5934-4c2c-9361-23d20b5a495e" containerName="extract-content" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.045872 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7080e6b3-5934-4c2c-9361-23d20b5a495e" containerName="extract-content" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.045883 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbee8a79-e625-49ef-8fcb-944341ae6e37" containerName="extract-utilities" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.045892 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbee8a79-e625-49ef-8fcb-944341ae6e37" containerName="extract-utilities" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.045906 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" containerName="extract-utilities" Mar 13 20:35:26 crc 
kubenswrapper[4790]: I0313 20:35:26.045916 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" containerName="extract-utilities" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.045928 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7080e6b3-5934-4c2c-9361-23d20b5a495e" containerName="extract-utilities" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.045936 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7080e6b3-5934-4c2c-9361-23d20b5a495e" containerName="extract-utilities" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.045946 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.045955 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.045968 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7080e6b3-5934-4c2c-9361-23d20b5a495e" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.045976 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7080e6b3-5934-4c2c-9361-23d20b5a495e" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.045990 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53c38463-b7c5-42c8-a447-7d0e7f190aa9" containerName="marketplace-operator" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.045998 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c38463-b7c5-42c8-a447-7d0e7f190aa9" containerName="marketplace-operator" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.046013 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" containerName="extract-content" Mar 13 20:35:26 
crc kubenswrapper[4790]: I0313 20:35:26.046021 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" containerName="extract-content" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.046033 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.046041 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.046151 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbee8a79-e625-49ef-8fcb-944341ae6e37" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.046166 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.046175 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7080e6b3-5934-4c2c-9361-23d20b5a495e" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.046191 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.046206 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="53c38463-b7c5-42c8-a447-7d0e7f190aa9" containerName="marketplace-operator" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.047114 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.049626 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.052086 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q5brt"] Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.169976 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e374399-85bd-4121-9352-23a37bdf41f3-utilities\") pod \"certified-operators-q5brt\" (UID: \"9e374399-85bd-4121-9352-23a37bdf41f3\") " pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.170026 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e374399-85bd-4121-9352-23a37bdf41f3-catalog-content\") pod \"certified-operators-q5brt\" (UID: \"9e374399-85bd-4121-9352-23a37bdf41f3\") " pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.170064 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdh84\" (UniqueName: \"kubernetes.io/projected/9e374399-85bd-4121-9352-23a37bdf41f3-kube-api-access-cdh84\") pod \"certified-operators-q5brt\" (UID: \"9e374399-85bd-4121-9352-23a37bdf41f3\") " pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.194119 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" podUID="81949470-5c0d-4294-8618-d6ee14da1d41" containerName="registry" 
containerID="cri-o://1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb" gracePeriod=30 Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.271398 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e374399-85bd-4121-9352-23a37bdf41f3-utilities\") pod \"certified-operators-q5brt\" (UID: \"9e374399-85bd-4121-9352-23a37bdf41f3\") " pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.271446 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e374399-85bd-4121-9352-23a37bdf41f3-catalog-content\") pod \"certified-operators-q5brt\" (UID: \"9e374399-85bd-4121-9352-23a37bdf41f3\") " pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.271485 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdh84\" (UniqueName: \"kubernetes.io/projected/9e374399-85bd-4121-9352-23a37bdf41f3-kube-api-access-cdh84\") pod \"certified-operators-q5brt\" (UID: \"9e374399-85bd-4121-9352-23a37bdf41f3\") " pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.272152 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e374399-85bd-4121-9352-23a37bdf41f3-catalog-content\") pod \"certified-operators-q5brt\" (UID: \"9e374399-85bd-4121-9352-23a37bdf41f3\") " pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.272156 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e374399-85bd-4121-9352-23a37bdf41f3-utilities\") pod \"certified-operators-q5brt\" (UID: 
\"9e374399-85bd-4121-9352-23a37bdf41f3\") " pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.288814 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdh84\" (UniqueName: \"kubernetes.io/projected/9e374399-85bd-4121-9352-23a37bdf41f3-kube-api-access-cdh84\") pod \"certified-operators-q5brt\" (UID: \"9e374399-85bd-4121-9352-23a37bdf41f3\") " pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.365772 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.571395 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.572253 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q5brt"] Mar 13 20:35:26 crc kubenswrapper[4790]: W0313 20:35:26.575960 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e374399_85bd_4121_9352_23a37bdf41f3.slice/crio-27a2b34e9f755b35c2ccc2f3d2b1c882d91d4c65a3a98d843d66190259160dd7 WatchSource:0}: Error finding container 27a2b34e9f755b35c2ccc2f3d2b1c882d91d4c65a3a98d843d66190259160dd7: Status 404 returned error can't find the container with id 27a2b34e9f755b35c2ccc2f3d2b1c882d91d4c65a3a98d843d66190259160dd7 Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.609248 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5brt" event={"ID":"9e374399-85bd-4121-9352-23a37bdf41f3","Type":"ContainerStarted","Data":"27a2b34e9f755b35c2ccc2f3d2b1c882d91d4c65a3a98d843d66190259160dd7"} Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.611169 
4790 generic.go:334] "Generic (PLEG): container finished" podID="81949470-5c0d-4294-8618-d6ee14da1d41" containerID="1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb" exitCode=0 Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.611456 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.611479 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" event={"ID":"81949470-5c0d-4294-8618-d6ee14da1d41","Type":"ContainerDied","Data":"1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb"} Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.611520 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" event={"ID":"81949470-5c0d-4294-8618-d6ee14da1d41","Type":"ContainerDied","Data":"403adc19adb7ec63c9d90ee6fa3c1500a5901074edab6bc1faa1e7eed14336b6"} Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.611564 4790 scope.go:117] "RemoveContainer" containerID="1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.643713 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zhllj"] Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.647055 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81949470-5c0d-4294-8618-d6ee14da1d41" containerName="registry" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.647191 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="81949470-5c0d-4294-8618-d6ee14da1d41" containerName="registry" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.647403 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="81949470-5c0d-4294-8618-d6ee14da1d41" containerName="registry" Mar 13 
20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.649652 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.651009 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zhllj"] Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.651171 4790 scope.go:117] "RemoveContainer" containerID="1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.651691 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.652505 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb\": container with ID starting with 1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb not found: ID does not exist" containerID="1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.652531 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb"} err="failed to get container status \"1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb\": rpc error: code = NotFound desc = could not find container \"1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb\": container with ID starting with 1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb not found: ID does not exist" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.677144 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-trusted-ca\") pod \"81949470-5c0d-4294-8618-d6ee14da1d41\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.677212 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-registry-certificates\") pod \"81949470-5c0d-4294-8618-d6ee14da1d41\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.677259 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-registry-tls\") pod \"81949470-5c0d-4294-8618-d6ee14da1d41\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.677288 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/81949470-5c0d-4294-8618-d6ee14da1d41-installation-pull-secrets\") pod \"81949470-5c0d-4294-8618-d6ee14da1d41\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.677330 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/81949470-5c0d-4294-8618-d6ee14da1d41-ca-trust-extracted\") pod \"81949470-5c0d-4294-8618-d6ee14da1d41\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.677352 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf4v8\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-kube-api-access-zf4v8\") pod \"81949470-5c0d-4294-8618-d6ee14da1d41\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " Mar 13 
20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.677397 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-bound-sa-token\") pod \"81949470-5c0d-4294-8618-d6ee14da1d41\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.677541 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"81949470-5c0d-4294-8618-d6ee14da1d41\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.677966 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "81949470-5c0d-4294-8618-d6ee14da1d41" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.679855 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "81949470-5c0d-4294-8618-d6ee14da1d41" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.687407 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81949470-5c0d-4294-8618-d6ee14da1d41-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "81949470-5c0d-4294-8618-d6ee14da1d41" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41"). 
InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.687604 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "81949470-5c0d-4294-8618-d6ee14da1d41" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.687625 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-kube-api-access-zf4v8" (OuterVolumeSpecName: "kube-api-access-zf4v8") pod "81949470-5c0d-4294-8618-d6ee14da1d41" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41"). InnerVolumeSpecName "kube-api-access-zf4v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.687649 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "81949470-5c0d-4294-8618-d6ee14da1d41" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.697782 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "81949470-5c0d-4294-8618-d6ee14da1d41" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.703252 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81949470-5c0d-4294-8618-d6ee14da1d41-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "81949470-5c0d-4294-8618-d6ee14da1d41" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.778789 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0af82375-cffb-4861-82d2-5f1a0e4a8496-catalog-content\") pod \"redhat-marketplace-zhllj\" (UID: \"0af82375-cffb-4861-82d2-5f1a0e4a8496\") " pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.778832 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0af82375-cffb-4861-82d2-5f1a0e4a8496-utilities\") pod \"redhat-marketplace-zhllj\" (UID: \"0af82375-cffb-4861-82d2-5f1a0e4a8496\") " pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.778915 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rktt5\" (UniqueName: \"kubernetes.io/projected/0af82375-cffb-4861-82d2-5f1a0e4a8496-kube-api-access-rktt5\") pod \"redhat-marketplace-zhllj\" (UID: \"0af82375-cffb-4861-82d2-5f1a0e4a8496\") " pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.778955 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-trusted-ca\") on node \"crc\" DevicePath 
\"\"" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.778968 4790 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.778978 4790 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.778987 4790 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/81949470-5c0d-4294-8618-d6ee14da1d41-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.778995 4790 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/81949470-5c0d-4294-8618-d6ee14da1d41-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.779003 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf4v8\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-kube-api-access-zf4v8\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.779011 4790 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.880598 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rktt5\" (UniqueName: \"kubernetes.io/projected/0af82375-cffb-4861-82d2-5f1a0e4a8496-kube-api-access-rktt5\") pod \"redhat-marketplace-zhllj\" (UID: 
\"0af82375-cffb-4861-82d2-5f1a0e4a8496\") " pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.881009 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0af82375-cffb-4861-82d2-5f1a0e4a8496-catalog-content\") pod \"redhat-marketplace-zhllj\" (UID: \"0af82375-cffb-4861-82d2-5f1a0e4a8496\") " pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.881034 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0af82375-cffb-4861-82d2-5f1a0e4a8496-utilities\") pod \"redhat-marketplace-zhllj\" (UID: \"0af82375-cffb-4861-82d2-5f1a0e4a8496\") " pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.881475 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0af82375-cffb-4861-82d2-5f1a0e4a8496-catalog-content\") pod \"redhat-marketplace-zhllj\" (UID: \"0af82375-cffb-4861-82d2-5f1a0e4a8496\") " pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.881739 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0af82375-cffb-4861-82d2-5f1a0e4a8496-utilities\") pod \"redhat-marketplace-zhllj\" (UID: \"0af82375-cffb-4861-82d2-5f1a0e4a8496\") " pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.896179 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rktt5\" (UniqueName: \"kubernetes.io/projected/0af82375-cffb-4861-82d2-5f1a0e4a8496-kube-api-access-rktt5\") pod \"redhat-marketplace-zhllj\" (UID: \"0af82375-cffb-4861-82d2-5f1a0e4a8496\") " 
pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.943266 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqdfm"] Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.947519 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqdfm"] Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.978093 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:27 crc kubenswrapper[4790]: I0313 20:35:27.175773 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zhllj"] Mar 13 20:35:27 crc kubenswrapper[4790]: W0313 20:35:27.183295 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0af82375_cffb_4861_82d2_5f1a0e4a8496.slice/crio-2b780374c919c4b0fc2b00a707ef7e290af83206af862f66634973ad1bbfec45 WatchSource:0}: Error finding container 2b780374c919c4b0fc2b00a707ef7e290af83206af862f66634973ad1bbfec45: Status 404 returned error can't find the container with id 2b780374c919c4b0fc2b00a707ef7e290af83206af862f66634973ad1bbfec45 Mar 13 20:35:27 crc kubenswrapper[4790]: I0313 20:35:27.619961 4790 generic.go:334] "Generic (PLEG): container finished" podID="0af82375-cffb-4861-82d2-5f1a0e4a8496" containerID="8d1a6ce11ec2379dc5ac63c4b98e51824c8d3e9f6f2f802ee9d1593ae4100871" exitCode=0 Mar 13 20:35:27 crc kubenswrapper[4790]: I0313 20:35:27.620039 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zhllj" event={"ID":"0af82375-cffb-4861-82d2-5f1a0e4a8496","Type":"ContainerDied","Data":"8d1a6ce11ec2379dc5ac63c4b98e51824c8d3e9f6f2f802ee9d1593ae4100871"} Mar 13 20:35:27 crc kubenswrapper[4790]: I0313 20:35:27.620110 4790 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-zhllj" event={"ID":"0af82375-cffb-4861-82d2-5f1a0e4a8496","Type":"ContainerStarted","Data":"2b780374c919c4b0fc2b00a707ef7e290af83206af862f66634973ad1bbfec45"} Mar 13 20:35:27 crc kubenswrapper[4790]: I0313 20:35:27.622532 4790 generic.go:334] "Generic (PLEG): container finished" podID="9e374399-85bd-4121-9352-23a37bdf41f3" containerID="9bb0f14a40c31d619fccd1d6803a86e518927862e7ce130d3b5bf77d32f80c8e" exitCode=0 Mar 13 20:35:27 crc kubenswrapper[4790]: I0313 20:35:27.622837 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5brt" event={"ID":"9e374399-85bd-4121-9352-23a37bdf41f3","Type":"ContainerDied","Data":"9bb0f14a40c31d619fccd1d6803a86e518927862e7ce130d3b5bf77d32f80c8e"} Mar 13 20:35:27 crc kubenswrapper[4790]: I0313 20:35:27.671848 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36d32cb2-55c9-48cc-9376-66231ae66f8a" path="/var/lib/kubelet/pods/36d32cb2-55c9-48cc-9376-66231ae66f8a/volumes" Mar 13 20:35:27 crc kubenswrapper[4790]: I0313 20:35:27.672719 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81949470-5c0d-4294-8618-d6ee14da1d41" path="/var/lib/kubelet/pods/81949470-5c0d-4294-8618-d6ee14da1d41/volumes" Mar 13 20:35:27 crc kubenswrapper[4790]: I0313 20:35:27.673742 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" path="/var/lib/kubelet/pods/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c/volumes" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.442466 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b7b2s"] Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.443971 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.449832 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.457117 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b7b2s"] Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.615158 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ad984b4-e6a7-4559-99e4-02a03eda6303-utilities\") pod \"redhat-operators-b7b2s\" (UID: \"5ad984b4-e6a7-4559-99e4-02a03eda6303\") " pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.615231 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx98v\" (UniqueName: \"kubernetes.io/projected/5ad984b4-e6a7-4559-99e4-02a03eda6303-kube-api-access-fx98v\") pod \"redhat-operators-b7b2s\" (UID: \"5ad984b4-e6a7-4559-99e4-02a03eda6303\") " pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.615267 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ad984b4-e6a7-4559-99e4-02a03eda6303-catalog-content\") pod \"redhat-operators-b7b2s\" (UID: \"5ad984b4-e6a7-4559-99e4-02a03eda6303\") " pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.716332 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ad984b4-e6a7-4559-99e4-02a03eda6303-utilities\") pod \"redhat-operators-b7b2s\" (UID: \"5ad984b4-e6a7-4559-99e4-02a03eda6303\") " 
pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.716423 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx98v\" (UniqueName: \"kubernetes.io/projected/5ad984b4-e6a7-4559-99e4-02a03eda6303-kube-api-access-fx98v\") pod \"redhat-operators-b7b2s\" (UID: \"5ad984b4-e6a7-4559-99e4-02a03eda6303\") " pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.716464 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ad984b4-e6a7-4559-99e4-02a03eda6303-catalog-content\") pod \"redhat-operators-b7b2s\" (UID: \"5ad984b4-e6a7-4559-99e4-02a03eda6303\") " pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.716976 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ad984b4-e6a7-4559-99e4-02a03eda6303-catalog-content\") pod \"redhat-operators-b7b2s\" (UID: \"5ad984b4-e6a7-4559-99e4-02a03eda6303\") " pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.717168 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ad984b4-e6a7-4559-99e4-02a03eda6303-utilities\") pod \"redhat-operators-b7b2s\" (UID: \"5ad984b4-e6a7-4559-99e4-02a03eda6303\") " pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.747467 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx98v\" (UniqueName: \"kubernetes.io/projected/5ad984b4-e6a7-4559-99e4-02a03eda6303-kube-api-access-fx98v\") pod \"redhat-operators-b7b2s\" (UID: \"5ad984b4-e6a7-4559-99e4-02a03eda6303\") " pod="openshift-marketplace/redhat-operators-b7b2s" Mar 
13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.763123 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.047240 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cpxlj"] Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.048433 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.052465 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.067844 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cpxlj"] Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.171284 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b7b2s"] Mar 13 20:35:29 crc kubenswrapper[4790]: W0313 20:35:29.207084 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ad984b4_e6a7_4559_99e4_02a03eda6303.slice/crio-867c8dce7491b555b064cff3826eb0e008ae4c6a9a2da5a4bb83919dc9c50450 WatchSource:0}: Error finding container 867c8dce7491b555b064cff3826eb0e008ae4c6a9a2da5a4bb83919dc9c50450: Status 404 returned error can't find the container with id 867c8dce7491b555b064cff3826eb0e008ae4c6a9a2da5a4bb83919dc9c50450 Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.223257 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-utilities\") pod \"community-operators-cpxlj\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " 
pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.223397 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-catalog-content\") pod \"community-operators-cpxlj\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.223443 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7qhv\" (UniqueName: \"kubernetes.io/projected/62b23203-ded5-4b14-8a86-89c3ce3e33df-kube-api-access-z7qhv\") pod \"community-operators-cpxlj\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.324206 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7qhv\" (UniqueName: \"kubernetes.io/projected/62b23203-ded5-4b14-8a86-89c3ce3e33df-kube-api-access-z7qhv\") pod \"community-operators-cpxlj\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.324276 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-utilities\") pod \"community-operators-cpxlj\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.324329 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-catalog-content\") pod \"community-operators-cpxlj\" (UID: 
\"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.325092 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-utilities\") pod \"community-operators-cpxlj\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.325165 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-catalog-content\") pod \"community-operators-cpxlj\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.345575 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7qhv\" (UniqueName: \"kubernetes.io/projected/62b23203-ded5-4b14-8a86-89c3ce3e33df-kube-api-access-z7qhv\") pod \"community-operators-cpxlj\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.372018 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.652891 4790 generic.go:334] "Generic (PLEG): container finished" podID="5ad984b4-e6a7-4559-99e4-02a03eda6303" containerID="1ea94cf34970da34b9edd56b52883436432d0846e4fd98c5242e8a483a82e44d" exitCode=0 Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.652960 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7b2s" event={"ID":"5ad984b4-e6a7-4559-99e4-02a03eda6303","Type":"ContainerDied","Data":"1ea94cf34970da34b9edd56b52883436432d0846e4fd98c5242e8a483a82e44d"} Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.653205 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7b2s" event={"ID":"5ad984b4-e6a7-4559-99e4-02a03eda6303","Type":"ContainerStarted","Data":"867c8dce7491b555b064cff3826eb0e008ae4c6a9a2da5a4bb83919dc9c50450"} Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.657697 4790 generic.go:334] "Generic (PLEG): container finished" podID="0af82375-cffb-4861-82d2-5f1a0e4a8496" containerID="34d3cba630965a127d3ad6e5a44b41ae0df1de3961f5bc2e0d603f3d76f2f11a" exitCode=0 Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.657791 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zhllj" event={"ID":"0af82375-cffb-4861-82d2-5f1a0e4a8496","Type":"ContainerDied","Data":"34d3cba630965a127d3ad6e5a44b41ae0df1de3961f5bc2e0d603f3d76f2f11a"} Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.660498 4790 generic.go:334] "Generic (PLEG): container finished" podID="9e374399-85bd-4121-9352-23a37bdf41f3" containerID="8e432a91687e5677ab3de20e9449f793af10ac2bdba711d5b20cace3b9558a25" exitCode=0 Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.668134 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5brt" 
event={"ID":"9e374399-85bd-4121-9352-23a37bdf41f3","Type":"ContainerDied","Data":"8e432a91687e5677ab3de20e9449f793af10ac2bdba711d5b20cace3b9558a25"} Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.744953 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cpxlj"] Mar 13 20:35:30 crc kubenswrapper[4790]: I0313 20:35:30.666654 4790 generic.go:334] "Generic (PLEG): container finished" podID="62b23203-ded5-4b14-8a86-89c3ce3e33df" containerID="425aaeba7e7d5553bbf3404989d75b923a49be0649aab9f4f7747d04b2f856fc" exitCode=0 Mar 13 20:35:30 crc kubenswrapper[4790]: I0313 20:35:30.666733 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpxlj" event={"ID":"62b23203-ded5-4b14-8a86-89c3ce3e33df","Type":"ContainerDied","Data":"425aaeba7e7d5553bbf3404989d75b923a49be0649aab9f4f7747d04b2f856fc"} Mar 13 20:35:30 crc kubenswrapper[4790]: I0313 20:35:30.667013 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpxlj" event={"ID":"62b23203-ded5-4b14-8a86-89c3ce3e33df","Type":"ContainerStarted","Data":"abfa15f6de4daed047e18e5a602cd0577d104072963eda4b67a1d006df7fb930"} Mar 13 20:35:30 crc kubenswrapper[4790]: I0313 20:35:30.669642 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5brt" event={"ID":"9e374399-85bd-4121-9352-23a37bdf41f3","Type":"ContainerStarted","Data":"7bb0abea474f4c7b73c8dae1ac16fd59304353b192910368f5641e4fd30c5921"} Mar 13 20:35:30 crc kubenswrapper[4790]: I0313 20:35:30.672269 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zhllj" event={"ID":"0af82375-cffb-4861-82d2-5f1a0e4a8496","Type":"ContainerStarted","Data":"790b4f0b0842a0325ae2abf506443c4d02d22849427a074f57e3867741930deb"} Mar 13 20:35:30 crc kubenswrapper[4790]: I0313 20:35:30.712250 4790 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/certified-operators-q5brt" podStartSLOduration=2.202912698 podStartE2EDuration="4.712231964s" podCreationTimestamp="2026-03-13 20:35:26 +0000 UTC" firstStartedPulling="2026-03-13 20:35:27.624977275 +0000 UTC m=+458.646093186" lastFinishedPulling="2026-03-13 20:35:30.134296521 +0000 UTC m=+461.155412452" observedRunningTime="2026-03-13 20:35:30.709538899 +0000 UTC m=+461.730654800" watchObservedRunningTime="2026-03-13 20:35:30.712231964 +0000 UTC m=+461.733347875" Mar 13 20:35:30 crc kubenswrapper[4790]: I0313 20:35:30.732870 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zhllj" podStartSLOduration=2.293901086 podStartE2EDuration="4.73285099s" podCreationTimestamp="2026-03-13 20:35:26 +0000 UTC" firstStartedPulling="2026-03-13 20:35:27.620989214 +0000 UTC m=+458.642105125" lastFinishedPulling="2026-03-13 20:35:30.059939138 +0000 UTC m=+461.081055029" observedRunningTime="2026-03-13 20:35:30.729845366 +0000 UTC m=+461.750961267" watchObservedRunningTime="2026-03-13 20:35:30.73285099 +0000 UTC m=+461.753966881" Mar 13 20:35:31 crc kubenswrapper[4790]: I0313 20:35:31.678811 4790 generic.go:334] "Generic (PLEG): container finished" podID="5ad984b4-e6a7-4559-99e4-02a03eda6303" containerID="4a73ccfcfd60d95f2e67e112f884ac61807932beb8afc3b52f8edbee5ed0f550" exitCode=0 Mar 13 20:35:31 crc kubenswrapper[4790]: I0313 20:35:31.678843 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7b2s" event={"ID":"5ad984b4-e6a7-4559-99e4-02a03eda6303","Type":"ContainerDied","Data":"4a73ccfcfd60d95f2e67e112f884ac61807932beb8afc3b52f8edbee5ed0f550"} Mar 13 20:35:32 crc kubenswrapper[4790]: I0313 20:35:32.687889 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7b2s" 
event={"ID":"5ad984b4-e6a7-4559-99e4-02a03eda6303","Type":"ContainerStarted","Data":"537365b611324b431c87de833a1023e94baaefae81b46870a2ccc17d17c38283"} Mar 13 20:35:32 crc kubenswrapper[4790]: I0313 20:35:32.689994 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpxlj" event={"ID":"62b23203-ded5-4b14-8a86-89c3ce3e33df","Type":"ContainerStarted","Data":"abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2"} Mar 13 20:35:32 crc kubenswrapper[4790]: I0313 20:35:32.720280 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b7b2s" podStartSLOduration=2.342240378 podStartE2EDuration="4.720254423s" podCreationTimestamp="2026-03-13 20:35:28 +0000 UTC" firstStartedPulling="2026-03-13 20:35:29.655426698 +0000 UTC m=+460.676542589" lastFinishedPulling="2026-03-13 20:35:32.033440743 +0000 UTC m=+463.054556634" observedRunningTime="2026-03-13 20:35:32.707899319 +0000 UTC m=+463.729015230" watchObservedRunningTime="2026-03-13 20:35:32.720254423 +0000 UTC m=+463.741370324" Mar 13 20:35:33 crc kubenswrapper[4790]: I0313 20:35:33.709142 4790 generic.go:334] "Generic (PLEG): container finished" podID="62b23203-ded5-4b14-8a86-89c3ce3e33df" containerID="abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2" exitCode=0 Mar 13 20:35:33 crc kubenswrapper[4790]: I0313 20:35:33.709269 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpxlj" event={"ID":"62b23203-ded5-4b14-8a86-89c3ce3e33df","Type":"ContainerDied","Data":"abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2"} Mar 13 20:35:34 crc kubenswrapper[4790]: I0313 20:35:34.716300 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpxlj" 
event={"ID":"62b23203-ded5-4b14-8a86-89c3ce3e33df","Type":"ContainerStarted","Data":"1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5"} Mar 13 20:35:34 crc kubenswrapper[4790]: I0313 20:35:34.733544 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cpxlj" podStartSLOduration=2.237924983 podStartE2EDuration="5.733527078s" podCreationTimestamp="2026-03-13 20:35:29 +0000 UTC" firstStartedPulling="2026-03-13 20:35:30.669448182 +0000 UTC m=+461.690564073" lastFinishedPulling="2026-03-13 20:35:34.165050267 +0000 UTC m=+465.186166168" observedRunningTime="2026-03-13 20:35:34.731404029 +0000 UTC m=+465.752519930" watchObservedRunningTime="2026-03-13 20:35:34.733527078 +0000 UTC m=+465.754642969" Mar 13 20:35:36 crc kubenswrapper[4790]: I0313 20:35:36.366688 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:36 crc kubenswrapper[4790]: I0313 20:35:36.366750 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:36 crc kubenswrapper[4790]: I0313 20:35:36.435199 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:36 crc kubenswrapper[4790]: I0313 20:35:36.770287 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:36 crc kubenswrapper[4790]: I0313 20:35:36.978489 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:36 crc kubenswrapper[4790]: I0313 20:35:36.978781 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:37 crc kubenswrapper[4790]: I0313 20:35:37.015921 4790 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:37 crc kubenswrapper[4790]: I0313 20:35:37.771793 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:38 crc kubenswrapper[4790]: I0313 20:35:38.763592 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:38 crc kubenswrapper[4790]: I0313 20:35:38.763644 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:39 crc kubenswrapper[4790]: I0313 20:35:39.373203 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:39 crc kubenswrapper[4790]: I0313 20:35:39.373269 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:39 crc kubenswrapper[4790]: I0313 20:35:39.415865 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:39 crc kubenswrapper[4790]: I0313 20:35:39.783564 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:39 crc kubenswrapper[4790]: I0313 20:35:39.804932 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b7b2s" podUID="5ad984b4-e6a7-4559-99e4-02a03eda6303" containerName="registry-server" probeResult="failure" output=< Mar 13 20:35:39 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Mar 13 20:35:39 crc kubenswrapper[4790]: > Mar 13 20:35:44 crc kubenswrapper[4790]: I0313 20:35:44.015969 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:35:44 crc kubenswrapper[4790]: I0313 20:35:44.016294 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:35:44 crc kubenswrapper[4790]: I0313 20:35:44.016337 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:35:44 crc kubenswrapper[4790]: I0313 20:35:44.016827 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"88573fd1abdc5f0d1779ca1679bd1333545fafe5b76c1a0f0888a58d27d16db6"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 20:35:44 crc kubenswrapper[4790]: I0313 20:35:44.016869 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://88573fd1abdc5f0d1779ca1679bd1333545fafe5b76c1a0f0888a58d27d16db6" gracePeriod=600 Mar 13 20:35:44 crc kubenswrapper[4790]: I0313 20:35:44.770446 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="88573fd1abdc5f0d1779ca1679bd1333545fafe5b76c1a0f0888a58d27d16db6" exitCode=0 Mar 13 20:35:44 crc kubenswrapper[4790]: I0313 20:35:44.770551 4790 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"88573fd1abdc5f0d1779ca1679bd1333545fafe5b76c1a0f0888a58d27d16db6"} Mar 13 20:35:44 crc kubenswrapper[4790]: I0313 20:35:44.770714 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"876ea65d0ee844d8eca512c0665da98289a1647386d506ab2af3d32c73dd69b4"} Mar 13 20:35:44 crc kubenswrapper[4790]: I0313 20:35:44.770735 4790 scope.go:117] "RemoveContainer" containerID="a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400" Mar 13 20:35:48 crc kubenswrapper[4790]: I0313 20:35:48.820019 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:48 crc kubenswrapper[4790]: I0313 20:35:48.871304 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.134069 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557236-tczbl"] Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.135302 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557236-tczbl" Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.138118 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.138199 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.138217 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.141109 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557236-tczbl"] Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.198613 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg82s\" (UniqueName: \"kubernetes.io/projected/43b65fb5-f36b-4fae-ba13-03b5c81d1639-kube-api-access-gg82s\") pod \"auto-csr-approver-29557236-tczbl\" (UID: \"43b65fb5-f36b-4fae-ba13-03b5c81d1639\") " pod="openshift-infra/auto-csr-approver-29557236-tczbl" Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.300095 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg82s\" (UniqueName: \"kubernetes.io/projected/43b65fb5-f36b-4fae-ba13-03b5c81d1639-kube-api-access-gg82s\") pod \"auto-csr-approver-29557236-tczbl\" (UID: \"43b65fb5-f36b-4fae-ba13-03b5c81d1639\") " pod="openshift-infra/auto-csr-approver-29557236-tczbl" Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.317632 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg82s\" (UniqueName: \"kubernetes.io/projected/43b65fb5-f36b-4fae-ba13-03b5c81d1639-kube-api-access-gg82s\") pod \"auto-csr-approver-29557236-tczbl\" (UID: \"43b65fb5-f36b-4fae-ba13-03b5c81d1639\") " 
pod="openshift-infra/auto-csr-approver-29557236-tczbl" Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.491481 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557236-tczbl" Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.680843 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557236-tczbl"] Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.868336 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557236-tczbl" event={"ID":"43b65fb5-f36b-4fae-ba13-03b5c81d1639","Type":"ContainerStarted","Data":"ced10b9e5181bbae5848dbc4fffc41ceb2a125517a22f0d199aac485af29d451"} Mar 13 20:36:02 crc kubenswrapper[4790]: I0313 20:36:02.881564 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557236-tczbl" event={"ID":"43b65fb5-f36b-4fae-ba13-03b5c81d1639","Type":"ContainerStarted","Data":"51921e4e629fa9d413e53a9a5c93f032ad474743b6e67b583c5b1e6927de7258"} Mar 13 20:36:02 crc kubenswrapper[4790]: I0313 20:36:02.902500 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557236-tczbl" podStartSLOduration=1.069087164 podStartE2EDuration="2.902473203s" podCreationTimestamp="2026-03-13 20:36:00 +0000 UTC" firstStartedPulling="2026-03-13 20:36:00.683313288 +0000 UTC m=+491.704429179" lastFinishedPulling="2026-03-13 20:36:02.516699327 +0000 UTC m=+493.537815218" observedRunningTime="2026-03-13 20:36:02.896793794 +0000 UTC m=+493.917909725" watchObservedRunningTime="2026-03-13 20:36:02.902473203 +0000 UTC m=+493.923589134" Mar 13 20:36:03 crc kubenswrapper[4790]: I0313 20:36:03.888082 4790 generic.go:334] "Generic (PLEG): container finished" podID="43b65fb5-f36b-4fae-ba13-03b5c81d1639" containerID="51921e4e629fa9d413e53a9a5c93f032ad474743b6e67b583c5b1e6927de7258" exitCode=0 Mar 13 20:36:03 crc 
kubenswrapper[4790]: I0313 20:36:03.888145 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557236-tczbl" event={"ID":"43b65fb5-f36b-4fae-ba13-03b5c81d1639","Type":"ContainerDied","Data":"51921e4e629fa9d413e53a9a5c93f032ad474743b6e67b583c5b1e6927de7258"} Mar 13 20:36:05 crc kubenswrapper[4790]: I0313 20:36:05.127169 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557236-tczbl" Mar 13 20:36:05 crc kubenswrapper[4790]: I0313 20:36:05.160830 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg82s\" (UniqueName: \"kubernetes.io/projected/43b65fb5-f36b-4fae-ba13-03b5c81d1639-kube-api-access-gg82s\") pod \"43b65fb5-f36b-4fae-ba13-03b5c81d1639\" (UID: \"43b65fb5-f36b-4fae-ba13-03b5c81d1639\") " Mar 13 20:36:05 crc kubenswrapper[4790]: I0313 20:36:05.167256 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43b65fb5-f36b-4fae-ba13-03b5c81d1639-kube-api-access-gg82s" (OuterVolumeSpecName: "kube-api-access-gg82s") pod "43b65fb5-f36b-4fae-ba13-03b5c81d1639" (UID: "43b65fb5-f36b-4fae-ba13-03b5c81d1639"). InnerVolumeSpecName "kube-api-access-gg82s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:36:05 crc kubenswrapper[4790]: I0313 20:36:05.262305 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg82s\" (UniqueName: \"kubernetes.io/projected/43b65fb5-f36b-4fae-ba13-03b5c81d1639-kube-api-access-gg82s\") on node \"crc\" DevicePath \"\"" Mar 13 20:36:05 crc kubenswrapper[4790]: I0313 20:36:05.900289 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557236-tczbl" event={"ID":"43b65fb5-f36b-4fae-ba13-03b5c81d1639","Type":"ContainerDied","Data":"ced10b9e5181bbae5848dbc4fffc41ceb2a125517a22f0d199aac485af29d451"} Mar 13 20:36:05 crc kubenswrapper[4790]: I0313 20:36:05.900324 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ced10b9e5181bbae5848dbc4fffc41ceb2a125517a22f0d199aac485af29d451" Mar 13 20:36:05 crc kubenswrapper[4790]: I0313 20:36:05.900368 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557236-tczbl" Mar 13 20:36:05 crc kubenswrapper[4790]: I0313 20:36:05.947779 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557230-8pqh8"] Mar 13 20:36:05 crc kubenswrapper[4790]: I0313 20:36:05.952741 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557230-8pqh8"] Mar 13 20:36:07 crc kubenswrapper[4790]: I0313 20:36:07.665971 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d598b7c0-7c77-4903-9138-d8a3d01f9efe" path="/var/lib/kubelet/pods/d598b7c0-7c77-4903-9138-d8a3d01f9efe/volumes" Mar 13 20:37:28 crc kubenswrapper[4790]: I0313 20:37:28.455123 4790 scope.go:117] "RemoveContainer" containerID="4dce60806026c2e057eacfafdb9eb0bcee1204f32aecb7bffa715ddddc59e383" Mar 13 20:37:44 crc kubenswrapper[4790]: I0313 20:37:44.016773 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:37:44 crc kubenswrapper[4790]: I0313 20:37:44.017250 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.135764 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557238-jx8wj"] Mar 13 20:38:00 crc kubenswrapper[4790]: E0313 20:38:00.136435 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43b65fb5-f36b-4fae-ba13-03b5c81d1639" containerName="oc" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.136450 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="43b65fb5-f36b-4fae-ba13-03b5c81d1639" containerName="oc" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.136586 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="43b65fb5-f36b-4fae-ba13-03b5c81d1639" containerName="oc" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.136995 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557238-jx8wj" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.141113 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.142864 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.144063 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.150190 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557238-jx8wj"] Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.191568 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rhdk\" (UniqueName: \"kubernetes.io/projected/6c63bf97-e702-439a-8f3b-58d4496c91b9-kube-api-access-8rhdk\") pod \"auto-csr-approver-29557238-jx8wj\" (UID: \"6c63bf97-e702-439a-8f3b-58d4496c91b9\") " pod="openshift-infra/auto-csr-approver-29557238-jx8wj" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.292931 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rhdk\" (UniqueName: \"kubernetes.io/projected/6c63bf97-e702-439a-8f3b-58d4496c91b9-kube-api-access-8rhdk\") pod \"auto-csr-approver-29557238-jx8wj\" (UID: \"6c63bf97-e702-439a-8f3b-58d4496c91b9\") " pod="openshift-infra/auto-csr-approver-29557238-jx8wj" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.311006 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rhdk\" (UniqueName: \"kubernetes.io/projected/6c63bf97-e702-439a-8f3b-58d4496c91b9-kube-api-access-8rhdk\") pod \"auto-csr-approver-29557238-jx8wj\" (UID: \"6c63bf97-e702-439a-8f3b-58d4496c91b9\") " 
pod="openshift-infra/auto-csr-approver-29557238-jx8wj" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.502092 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557238-jx8wj" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.697961 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557238-jx8wj"] Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.710100 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 20:38:01 crc kubenswrapper[4790]: I0313 20:38:01.632365 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557238-jx8wj" event={"ID":"6c63bf97-e702-439a-8f3b-58d4496c91b9","Type":"ContainerStarted","Data":"18aff9ce0a963102ddae683328354fb98941cf77a7a279bb1519f12a72af6599"} Mar 13 20:38:02 crc kubenswrapper[4790]: I0313 20:38:02.640783 4790 generic.go:334] "Generic (PLEG): container finished" podID="6c63bf97-e702-439a-8f3b-58d4496c91b9" containerID="a1eeddc06106c1113c4a31e23128dada69c832330fa1711ed5544055f1b4392f" exitCode=0 Mar 13 20:38:02 crc kubenswrapper[4790]: I0313 20:38:02.640983 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557238-jx8wj" event={"ID":"6c63bf97-e702-439a-8f3b-58d4496c91b9","Type":"ContainerDied","Data":"a1eeddc06106c1113c4a31e23128dada69c832330fa1711ed5544055f1b4392f"} Mar 13 20:38:03 crc kubenswrapper[4790]: I0313 20:38:03.871644 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557238-jx8wj" Mar 13 20:38:04 crc kubenswrapper[4790]: I0313 20:38:04.047014 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rhdk\" (UniqueName: \"kubernetes.io/projected/6c63bf97-e702-439a-8f3b-58d4496c91b9-kube-api-access-8rhdk\") pod \"6c63bf97-e702-439a-8f3b-58d4496c91b9\" (UID: \"6c63bf97-e702-439a-8f3b-58d4496c91b9\") " Mar 13 20:38:04 crc kubenswrapper[4790]: I0313 20:38:04.073240 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c63bf97-e702-439a-8f3b-58d4496c91b9-kube-api-access-8rhdk" (OuterVolumeSpecName: "kube-api-access-8rhdk") pod "6c63bf97-e702-439a-8f3b-58d4496c91b9" (UID: "6c63bf97-e702-439a-8f3b-58d4496c91b9"). InnerVolumeSpecName "kube-api-access-8rhdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:38:04 crc kubenswrapper[4790]: I0313 20:38:04.148617 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rhdk\" (UniqueName: \"kubernetes.io/projected/6c63bf97-e702-439a-8f3b-58d4496c91b9-kube-api-access-8rhdk\") on node \"crc\" DevicePath \"\"" Mar 13 20:38:04 crc kubenswrapper[4790]: I0313 20:38:04.655805 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557238-jx8wj" event={"ID":"6c63bf97-e702-439a-8f3b-58d4496c91b9","Type":"ContainerDied","Data":"18aff9ce0a963102ddae683328354fb98941cf77a7a279bb1519f12a72af6599"} Mar 13 20:38:04 crc kubenswrapper[4790]: I0313 20:38:04.655852 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18aff9ce0a963102ddae683328354fb98941cf77a7a279bb1519f12a72af6599" Mar 13 20:38:04 crc kubenswrapper[4790]: I0313 20:38:04.655897 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557238-jx8wj" Mar 13 20:38:04 crc kubenswrapper[4790]: I0313 20:38:04.933992 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557232-bblq8"] Mar 13 20:38:04 crc kubenswrapper[4790]: I0313 20:38:04.944757 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557232-bblq8"] Mar 13 20:38:05 crc kubenswrapper[4790]: I0313 20:38:05.667966 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b190462f-7836-44f0-94c0-1311bdf8e550" path="/var/lib/kubelet/pods/b190462f-7836-44f0-94c0-1311bdf8e550/volumes" Mar 13 20:38:14 crc kubenswrapper[4790]: I0313 20:38:14.016020 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:38:14 crc kubenswrapper[4790]: I0313 20:38:14.016721 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:38:44 crc kubenswrapper[4790]: I0313 20:38:44.015437 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:38:44 crc kubenswrapper[4790]: I0313 20:38:44.016146 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" 
podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:38:44 crc kubenswrapper[4790]: I0313 20:38:44.016241 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:38:44 crc kubenswrapper[4790]: I0313 20:38:44.017507 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"876ea65d0ee844d8eca512c0665da98289a1647386d506ab2af3d32c73dd69b4"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 20:38:44 crc kubenswrapper[4790]: I0313 20:38:44.017619 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://876ea65d0ee844d8eca512c0665da98289a1647386d506ab2af3d32c73dd69b4" gracePeriod=600 Mar 13 20:38:44 crc kubenswrapper[4790]: I0313 20:38:44.950312 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="876ea65d0ee844d8eca512c0665da98289a1647386d506ab2af3d32c73dd69b4" exitCode=0 Mar 13 20:38:44 crc kubenswrapper[4790]: I0313 20:38:44.950763 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"876ea65d0ee844d8eca512c0665da98289a1647386d506ab2af3d32c73dd69b4"} Mar 13 20:38:44 crc kubenswrapper[4790]: I0313 20:38:44.950802 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"79e02ea9be9e1c9905df96f4d2c3972a24c6d7bee0d427327ce884018a382f4c"} Mar 13 20:38:44 crc kubenswrapper[4790]: I0313 20:38:44.950829 4790 scope.go:117] "RemoveContainer" containerID="88573fd1abdc5f0d1779ca1679bd1333545fafe5b76c1a0f0888a58d27d16db6" Mar 13 20:39:28 crc kubenswrapper[4790]: I0313 20:39:28.521745 4790 scope.go:117] "RemoveContainer" containerID="8f1a4232fe3ee20e22f3a57d7811b303dba4631c6cf2890a09449767842fc5b4" Mar 13 20:39:28 crc kubenswrapper[4790]: I0313 20:39:28.573767 4790 scope.go:117] "RemoveContainer" containerID="7924ab194fb126f41405d7a390a1fb75af9316272755308a5775fdb0f460db4d" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.154761 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557240-8qw5d"] Mar 13 20:40:00 crc kubenswrapper[4790]: E0313 20:40:00.157324 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c63bf97-e702-439a-8f3b-58d4496c91b9" containerName="oc" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.157432 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c63bf97-e702-439a-8f3b-58d4496c91b9" containerName="oc" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.157808 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c63bf97-e702-439a-8f3b-58d4496c91b9" containerName="oc" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.158703 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557240-8qw5d" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.161232 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.161621 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.162100 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.164502 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557240-8qw5d"] Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.269484 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt9fn\" (UniqueName: \"kubernetes.io/projected/f6f1fa3a-7f88-4e89-bd00-4426798fccce-kube-api-access-kt9fn\") pod \"auto-csr-approver-29557240-8qw5d\" (UID: \"f6f1fa3a-7f88-4e89-bd00-4426798fccce\") " pod="openshift-infra/auto-csr-approver-29557240-8qw5d" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.371695 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt9fn\" (UniqueName: \"kubernetes.io/projected/f6f1fa3a-7f88-4e89-bd00-4426798fccce-kube-api-access-kt9fn\") pod \"auto-csr-approver-29557240-8qw5d\" (UID: \"f6f1fa3a-7f88-4e89-bd00-4426798fccce\") " pod="openshift-infra/auto-csr-approver-29557240-8qw5d" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.406564 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt9fn\" (UniqueName: \"kubernetes.io/projected/f6f1fa3a-7f88-4e89-bd00-4426798fccce-kube-api-access-kt9fn\") pod \"auto-csr-approver-29557240-8qw5d\" (UID: \"f6f1fa3a-7f88-4e89-bd00-4426798fccce\") " 
pod="openshift-infra/auto-csr-approver-29557240-8qw5d" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.485993 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557240-8qw5d" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.715051 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557240-8qw5d"] Mar 13 20:40:01 crc kubenswrapper[4790]: I0313 20:40:01.478983 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557240-8qw5d" event={"ID":"f6f1fa3a-7f88-4e89-bd00-4426798fccce","Type":"ContainerStarted","Data":"0ab6a1d896fc66193d6078f1d3865ee51f8ae31a5063281d02344bd55f9ed347"} Mar 13 20:40:02 crc kubenswrapper[4790]: I0313 20:40:02.488524 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557240-8qw5d" event={"ID":"f6f1fa3a-7f88-4e89-bd00-4426798fccce","Type":"ContainerStarted","Data":"a4421190e0f8f7d5d0550c9770d73abc8a710d933f4a6e67738054d90201114f"} Mar 13 20:40:02 crc kubenswrapper[4790]: I0313 20:40:02.502880 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557240-8qw5d" podStartSLOduration=1.061056131 podStartE2EDuration="2.502864877s" podCreationTimestamp="2026-03-13 20:40:00 +0000 UTC" firstStartedPulling="2026-03-13 20:40:00.728484933 +0000 UTC m=+731.749600824" lastFinishedPulling="2026-03-13 20:40:02.170293669 +0000 UTC m=+733.191409570" observedRunningTime="2026-03-13 20:40:02.500416273 +0000 UTC m=+733.521532164" watchObservedRunningTime="2026-03-13 20:40:02.502864877 +0000 UTC m=+733.523980758" Mar 13 20:40:03 crc kubenswrapper[4790]: I0313 20:40:03.495256 4790 generic.go:334] "Generic (PLEG): container finished" podID="f6f1fa3a-7f88-4e89-bd00-4426798fccce" containerID="a4421190e0f8f7d5d0550c9770d73abc8a710d933f4a6e67738054d90201114f" exitCode=0 Mar 13 20:40:03 crc 
kubenswrapper[4790]: I0313 20:40:03.495356 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557240-8qw5d" event={"ID":"f6f1fa3a-7f88-4e89-bd00-4426798fccce","Type":"ContainerDied","Data":"a4421190e0f8f7d5d0550c9770d73abc8a710d933f4a6e67738054d90201114f"} Mar 13 20:40:04 crc kubenswrapper[4790]: I0313 20:40:04.703187 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557240-8qw5d" Mar 13 20:40:04 crc kubenswrapper[4790]: I0313 20:40:04.825738 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt9fn\" (UniqueName: \"kubernetes.io/projected/f6f1fa3a-7f88-4e89-bd00-4426798fccce-kube-api-access-kt9fn\") pod \"f6f1fa3a-7f88-4e89-bd00-4426798fccce\" (UID: \"f6f1fa3a-7f88-4e89-bd00-4426798fccce\") " Mar 13 20:40:04 crc kubenswrapper[4790]: I0313 20:40:04.833597 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6f1fa3a-7f88-4e89-bd00-4426798fccce-kube-api-access-kt9fn" (OuterVolumeSpecName: "kube-api-access-kt9fn") pod "f6f1fa3a-7f88-4e89-bd00-4426798fccce" (UID: "f6f1fa3a-7f88-4e89-bd00-4426798fccce"). InnerVolumeSpecName "kube-api-access-kt9fn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:40:04 crc kubenswrapper[4790]: I0313 20:40:04.927805 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt9fn\" (UniqueName: \"kubernetes.io/projected/f6f1fa3a-7f88-4e89-bd00-4426798fccce-kube-api-access-kt9fn\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:05 crc kubenswrapper[4790]: I0313 20:40:05.511972 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557240-8qw5d" event={"ID":"f6f1fa3a-7f88-4e89-bd00-4426798fccce","Type":"ContainerDied","Data":"0ab6a1d896fc66193d6078f1d3865ee51f8ae31a5063281d02344bd55f9ed347"} Mar 13 20:40:05 crc kubenswrapper[4790]: I0313 20:40:05.512424 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ab6a1d896fc66193d6078f1d3865ee51f8ae31a5063281d02344bd55f9ed347" Mar 13 20:40:05 crc kubenswrapper[4790]: I0313 20:40:05.512054 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557240-8qw5d" Mar 13 20:40:05 crc kubenswrapper[4790]: I0313 20:40:05.568565 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557234-6g6zh"] Mar 13 20:40:05 crc kubenswrapper[4790]: I0313 20:40:05.574647 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557234-6g6zh"] Mar 13 20:40:05 crc kubenswrapper[4790]: I0313 20:40:05.673094 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b8e0ffa-a21f-4726-8185-2cff61c94b91" path="/var/lib/kubelet/pods/6b8e0ffa-a21f-4726-8185-2cff61c94b91/volumes" Mar 13 20:40:28 crc kubenswrapper[4790]: I0313 20:40:28.661575 4790 scope.go:117] "RemoveContainer" containerID="4a133641d0a543ddd92802af2ba335acfaf29e7ed5636f43383cb7790a817cba" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.483860 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-cainjector-cf98fcc89-vfjwg"] Mar 13 20:40:35 crc kubenswrapper[4790]: E0313 20:40:35.484601 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f1fa3a-7f88-4e89-bd00-4426798fccce" containerName="oc" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.484613 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f1fa3a-7f88-4e89-bd00-4426798fccce" containerName="oc" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.484730 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6f1fa3a-7f88-4e89-bd00-4426798fccce" containerName="oc" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.485177 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-vfjwg" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.487425 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.487681 4790 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-5n769" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.487813 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.492104 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-fgq7z"] Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.492931 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-fgq7z" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.498579 4790 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-7pmvg" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.504573 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-vfjwg"] Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.519034 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-p4h8t"] Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.519890 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-p4h8t" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.524904 4790 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-cns5w" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.529948 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-fgq7z"] Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.539144 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-p4h8t"] Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.560221 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr4g9\" (UniqueName: \"kubernetes.io/projected/1430c143-e235-49e5-a141-78b9e3297b70-kube-api-access-mr4g9\") pod \"cert-manager-webhook-687f57d79b-p4h8t\" (UID: \"1430c143-e235-49e5-a141-78b9e3297b70\") " pod="cert-manager/cert-manager-webhook-687f57d79b-p4h8t" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.560313 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqp64\" (UniqueName: 
\"kubernetes.io/projected/f58ec868-a42c-463c-b65f-bf118fae6518-kube-api-access-gqp64\") pod \"cert-manager-cainjector-cf98fcc89-vfjwg\" (UID: \"f58ec868-a42c-463c-b65f-bf118fae6518\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-vfjwg" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.560360 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7rd2\" (UniqueName: \"kubernetes.io/projected/c77372fb-0649-4c32-be4f-34c3dd515246-kube-api-access-r7rd2\") pod \"cert-manager-858654f9db-fgq7z\" (UID: \"c77372fb-0649-4c32-be4f-34c3dd515246\") " pod="cert-manager/cert-manager-858654f9db-fgq7z" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.662052 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqp64\" (UniqueName: \"kubernetes.io/projected/f58ec868-a42c-463c-b65f-bf118fae6518-kube-api-access-gqp64\") pod \"cert-manager-cainjector-cf98fcc89-vfjwg\" (UID: \"f58ec868-a42c-463c-b65f-bf118fae6518\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-vfjwg" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.662103 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7rd2\" (UniqueName: \"kubernetes.io/projected/c77372fb-0649-4c32-be4f-34c3dd515246-kube-api-access-r7rd2\") pod \"cert-manager-858654f9db-fgq7z\" (UID: \"c77372fb-0649-4c32-be4f-34c3dd515246\") " pod="cert-manager/cert-manager-858654f9db-fgq7z" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.662178 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr4g9\" (UniqueName: \"kubernetes.io/projected/1430c143-e235-49e5-a141-78b9e3297b70-kube-api-access-mr4g9\") pod \"cert-manager-webhook-687f57d79b-p4h8t\" (UID: \"1430c143-e235-49e5-a141-78b9e3297b70\") " pod="cert-manager/cert-manager-webhook-687f57d79b-p4h8t" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.682556 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7rd2\" (UniqueName: \"kubernetes.io/projected/c77372fb-0649-4c32-be4f-34c3dd515246-kube-api-access-r7rd2\") pod \"cert-manager-858654f9db-fgq7z\" (UID: \"c77372fb-0649-4c32-be4f-34c3dd515246\") " pod="cert-manager/cert-manager-858654f9db-fgq7z" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.683622 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqp64\" (UniqueName: \"kubernetes.io/projected/f58ec868-a42c-463c-b65f-bf118fae6518-kube-api-access-gqp64\") pod \"cert-manager-cainjector-cf98fcc89-vfjwg\" (UID: \"f58ec868-a42c-463c-b65f-bf118fae6518\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-vfjwg" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.687677 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr4g9\" (UniqueName: \"kubernetes.io/projected/1430c143-e235-49e5-a141-78b9e3297b70-kube-api-access-mr4g9\") pod \"cert-manager-webhook-687f57d79b-p4h8t\" (UID: \"1430c143-e235-49e5-a141-78b9e3297b70\") " pod="cert-manager/cert-manager-webhook-687f57d79b-p4h8t" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.806403 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-vfjwg" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.822061 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-fgq7z" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.834669 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-p4h8t" Mar 13 20:40:36 crc kubenswrapper[4790]: I0313 20:40:36.231363 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-vfjwg"] Mar 13 20:40:36 crc kubenswrapper[4790]: I0313 20:40:36.283641 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-fgq7z"] Mar 13 20:40:36 crc kubenswrapper[4790]: W0313 20:40:36.284214 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc77372fb_0649_4c32_be4f_34c3dd515246.slice/crio-b5011cbba02125d299d3fc004e5a8cbde4229060971dfc309322456720607499 WatchSource:0}: Error finding container b5011cbba02125d299d3fc004e5a8cbde4229060971dfc309322456720607499: Status 404 returned error can't find the container with id b5011cbba02125d299d3fc004e5a8cbde4229060971dfc309322456720607499 Mar 13 20:40:36 crc kubenswrapper[4790]: W0313 20:40:36.285703 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1430c143_e235_49e5_a141_78b9e3297b70.slice/crio-529ed46a392eb8f87d5a80551f72f70ed053454c423a9e5e27093a115dc8b565 WatchSource:0}: Error finding container 529ed46a392eb8f87d5a80551f72f70ed053454c423a9e5e27093a115dc8b565: Status 404 returned error can't find the container with id 529ed46a392eb8f87d5a80551f72f70ed053454c423a9e5e27093a115dc8b565 Mar 13 20:40:36 crc kubenswrapper[4790]: I0313 20:40:36.288347 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-p4h8t"] Mar 13 20:40:36 crc kubenswrapper[4790]: I0313 20:40:36.716147 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-p4h8t" 
event={"ID":"1430c143-e235-49e5-a141-78b9e3297b70","Type":"ContainerStarted","Data":"529ed46a392eb8f87d5a80551f72f70ed053454c423a9e5e27093a115dc8b565"} Mar 13 20:40:36 crc kubenswrapper[4790]: I0313 20:40:36.717836 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-fgq7z" event={"ID":"c77372fb-0649-4c32-be4f-34c3dd515246","Type":"ContainerStarted","Data":"b5011cbba02125d299d3fc004e5a8cbde4229060971dfc309322456720607499"} Mar 13 20:40:36 crc kubenswrapper[4790]: I0313 20:40:36.718952 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-vfjwg" event={"ID":"f58ec868-a42c-463c-b65f-bf118fae6518","Type":"ContainerStarted","Data":"8b003f0a7c04ab13268103f2c0fe33fc373be8d6947436c5e9755e4aeb8d239a"} Mar 13 20:40:40 crc kubenswrapper[4790]: I0313 20:40:40.745172 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-vfjwg" event={"ID":"f58ec868-a42c-463c-b65f-bf118fae6518","Type":"ContainerStarted","Data":"4bc28f7ed08d9aab402506f9e501d1f4f0a538ac2f7d888937e7cab8ccda1a95"} Mar 13 20:40:40 crc kubenswrapper[4790]: I0313 20:40:40.746875 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-p4h8t" event={"ID":"1430c143-e235-49e5-a141-78b9e3297b70","Type":"ContainerStarted","Data":"abdbe82f6d3c51720a6b25b25557fa1ad4e09a214dcf615f660a0c6dba440acc"} Mar 13 20:40:40 crc kubenswrapper[4790]: I0313 20:40:40.747500 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-p4h8t" Mar 13 20:40:40 crc kubenswrapper[4790]: I0313 20:40:40.749051 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-fgq7z" event={"ID":"c77372fb-0649-4c32-be4f-34c3dd515246","Type":"ContainerStarted","Data":"4bf504403e04e8756565cb5837e470a7f72f492ccef20ce73fad57b8e3b45b46"} Mar 13 20:40:40 crc 
kubenswrapper[4790]: I0313 20:40:40.767434 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-vfjwg" podStartSLOduration=1.664276987 podStartE2EDuration="5.767419096s" podCreationTimestamp="2026-03-13 20:40:35 +0000 UTC" firstStartedPulling="2026-03-13 20:40:36.243231928 +0000 UTC m=+767.264347819" lastFinishedPulling="2026-03-13 20:40:40.346373997 +0000 UTC m=+771.367489928" observedRunningTime="2026-03-13 20:40:40.760943834 +0000 UTC m=+771.782059725" watchObservedRunningTime="2026-03-13 20:40:40.767419096 +0000 UTC m=+771.788534987" Mar 13 20:40:40 crc kubenswrapper[4790]: I0313 20:40:40.778319 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-p4h8t" podStartSLOduration=1.7246080799999999 podStartE2EDuration="5.778303523s" podCreationTimestamp="2026-03-13 20:40:35 +0000 UTC" firstStartedPulling="2026-03-13 20:40:36.289065949 +0000 UTC m=+767.310181840" lastFinishedPulling="2026-03-13 20:40:40.342761392 +0000 UTC m=+771.363877283" observedRunningTime="2026-03-13 20:40:40.774947784 +0000 UTC m=+771.796063675" watchObservedRunningTime="2026-03-13 20:40:40.778303523 +0000 UTC m=+771.799419414" Mar 13 20:40:40 crc kubenswrapper[4790]: I0313 20:40:40.796352 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-fgq7z" podStartSLOduration=1.668142988 podStartE2EDuration="5.79633436s" podCreationTimestamp="2026-03-13 20:40:35 +0000 UTC" firstStartedPulling="2026-03-13 20:40:36.286718827 +0000 UTC m=+767.307834718" lastFinishedPulling="2026-03-13 20:40:40.414910199 +0000 UTC m=+771.436026090" observedRunningTime="2026-03-13 20:40:40.794884091 +0000 UTC m=+771.815999982" watchObservedRunningTime="2026-03-13 20:40:40.79633436 +0000 UTC m=+771.817450251" Mar 13 20:40:44 crc kubenswrapper[4790]: I0313 20:40:44.015887 4790 patch_prober.go:28] interesting 
pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:40:44 crc kubenswrapper[4790]: I0313 20:40:44.015955 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:40:45 crc kubenswrapper[4790]: I0313 20:40:45.837636 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-p4h8t" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.015722 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gz4fj"] Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.016767 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovn-controller" containerID="cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438" gracePeriod=30 Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.017201 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="sbdb" containerID="cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd" gracePeriod=30 Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.017250 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="nbdb" 
containerID="cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167" gracePeriod=30 Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.017293 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="northd" containerID="cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96" gracePeriod=30 Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.017332 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec" gracePeriod=30 Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.017377 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="kube-rbac-proxy-node" containerID="cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c" gracePeriod=30 Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.017442 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovn-acl-logging" containerID="cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453" gracePeriod=30 Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.113735 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" containerID="cri-o://78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c" gracePeriod=30 Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 
20:40:55.407492 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/3.log" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.409822 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovn-acl-logging/0.log" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.410428 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovn-controller/0.log" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.410922 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.465896 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-slnjx"] Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.466462 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466480 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.466494 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="northd" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466501 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="northd" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.466512 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="kubecfg-setup" Mar 
13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466521 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="kubecfg-setup" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.466532 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="sbdb" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466541 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="sbdb" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.466557 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466565 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.466572 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="kube-rbac-proxy-ovn-metrics" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466582 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="kube-rbac-proxy-ovn-metrics" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.466595 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovn-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466602 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovn-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.466611 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovn-acl-logging" Mar 13 
20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466619 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovn-acl-logging" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.466630 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="nbdb" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466637 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="nbdb" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.466649 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466657 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.466667 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="kube-rbac-proxy-node" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466675 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="kube-rbac-proxy-node" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466807 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466819 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovn-acl-logging" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466831 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 
20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466841 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="sbdb" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466853 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovn-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466865 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="nbdb" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466874 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="northd" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466886 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466894 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466903 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="kube-rbac-proxy-ovn-metrics" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466912 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="kube-rbac-proxy-node" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.467064 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.467076 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" 
containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.467092 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.467101 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.467218 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.469130 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.515438 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-slash\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.515516 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovn-node-metrics-cert\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.515632 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-slash" (OuterVolumeSpecName: "host-slash") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.515734 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-kubelet\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.515788 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.515940 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-script-lib\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.515971 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-systemd-units\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516005 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h24bv\" (UniqueName: \"kubernetes.io/projected/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-kube-api-access-h24bv\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516038 4790 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-log-socket\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516069 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-config\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516105 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-node-log\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516134 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-netns\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516166 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-log-socket" (OuterVolumeSpecName: "log-socket") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516179 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-env-overrides\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516164 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516201 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-node-log" (OuterVolumeSpecName: "node-log") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516209 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-ovn-kubernetes\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516247 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516250 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516326 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-etc-openvswitch\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516373 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-bin\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516405 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516437 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-ovn\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516460 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-var-lib-openvswitch\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516467 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516493 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-systemd\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516489 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516511 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516519 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-netd\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516584 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516591 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516637 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516651 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-openvswitch\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516681 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516760 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516776 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516907 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-log-socket\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517006 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-var-lib-openvswitch\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517050 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517072 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517110 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-run-netns\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517134 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-run-openvswitch\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517151 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-slash\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517182 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-run-ovn\") pod 
\"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517266 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-run-ovn-kubernetes\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517307 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-kubelet\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517331 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-cni-netd\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517352 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngt5w\" (UniqueName: \"kubernetes.io/projected/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-kube-api-access-ngt5w\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517397 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-systemd-units\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517465 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-ovnkube-script-lib\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517515 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-cni-bin\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517549 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-env-overrides\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517594 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-run-systemd\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517649 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-ovnkube-config\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517679 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-node-log\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517715 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-etc-openvswitch\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517795 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-ovn-node-metrics-cert\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517904 4790 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-log-socket\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517922 4790 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: 
I0313 20:40:55.517936 4790 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-node-log\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517950 4790 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517962 4790 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517976 4790 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517987 4790 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517999 4790 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.518011 4790 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.518025 4790 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.518038 4790 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.518051 4790 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.518063 4790 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.518076 4790 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-slash\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.518088 4790 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.518100 4790 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.518111 4790 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.521275 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.522267 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-kube-api-access-h24bv" (OuterVolumeSpecName: "kube-api-access-h24bv") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "kube-api-access-h24bv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.530148 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.618824 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-kubelet\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.618881 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-cni-netd\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.618906 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngt5w\" (UniqueName: \"kubernetes.io/projected/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-kube-api-access-ngt5w\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.618926 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-systemd-units\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.618946 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-ovnkube-script-lib\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 
20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.618968 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-cni-bin\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.618990 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-env-overrides\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.618998 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-cni-netd\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619014 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-run-systemd\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619069 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-ovnkube-config\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619099 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-node-log\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619132 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-etc-openvswitch\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619157 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-ovn-node-metrics-cert\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619152 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-kubelet\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619213 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-log-socket\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619041 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-run-systemd\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619189 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-log-socket\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619254 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-cni-bin\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619263 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-var-lib-openvswitch\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619287 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-node-log\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619301 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619326 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-run-netns\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619348 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-slash\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619370 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-run-openvswitch\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619449 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-run-ovn\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619466 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-systemd-units\") 
pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619508 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-run-ovn-kubernetes\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619484 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-run-ovn-kubernetes\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619553 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-etc-openvswitch\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619567 4790 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619615 4790 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619630 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h24bv\" 
(UniqueName: \"kubernetes.io/projected/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-kube-api-access-h24bv\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619691 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-run-netns\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619721 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-var-lib-openvswitch\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619780 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-run-openvswitch\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619790 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-slash\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619816 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-run-ovn\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.620166 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-env-overrides\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.620355 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-ovnkube-script-lib\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.620377 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-ovnkube-config\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.620502 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.623750 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-ovn-node-metrics-cert\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" 
Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.637115 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngt5w\" (UniqueName: \"kubernetes.io/projected/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-kube-api-access-ngt5w\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.786993 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: W0313 20:40:55.813497 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53ef6fca_53d7_43a3_8d94_3a29f09cefc7.slice/crio-a44a7f9cd07ab910e8e72360dca5fb6cd8b5e0b60939dcff80414d537984e1c9 WatchSource:0}: Error finding container a44a7f9cd07ab910e8e72360dca5fb6cd8b5e0b60939dcff80414d537984e1c9: Status 404 returned error can't find the container with id a44a7f9cd07ab910e8e72360dca5fb6cd8b5e0b60939dcff80414d537984e1c9 Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.127148 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x2tjg_207e7f49-094a-4e59-a8ff-9eacd8d6fe2a/kube-multus/2.log" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.127723 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x2tjg_207e7f49-094a-4e59-a8ff-9eacd8d6fe2a/kube-multus/1.log" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.127770 4790 generic.go:334] "Generic (PLEG): container finished" podID="207e7f49-094a-4e59-a8ff-9eacd8d6fe2a" containerID="5a664c8908a82d034ede1821b9b77be44539b262b67dbd487d1b8e0a90a94221" exitCode=2 Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.127842 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x2tjg" 
event={"ID":"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a","Type":"ContainerDied","Data":"5a664c8908a82d034ede1821b9b77be44539b262b67dbd487d1b8e0a90a94221"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.127896 4790 scope.go:117] "RemoveContainer" containerID="9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.128443 4790 scope.go:117] "RemoveContainer" containerID="5a664c8908a82d034ede1821b9b77be44539b262b67dbd487d1b8e0a90a94221" Mar 13 20:40:56 crc kubenswrapper[4790]: E0313 20:40:56.128662 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-x2tjg_openshift-multus(207e7f49-094a-4e59-a8ff-9eacd8d6fe2a)\"" pod="openshift-multus/multus-x2tjg" podUID="207e7f49-094a-4e59-a8ff-9eacd8d6fe2a" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.134861 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/3.log" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.137452 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovn-acl-logging/0.log" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.137977 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovn-controller/0.log" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.138445 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c" exitCode=0 Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.138530 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd" exitCode=0 Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.138586 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167" exitCode=0 Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.138647 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96" exitCode=0 Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.138702 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec" exitCode=0 Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.138757 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c" exitCode=0 Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.138811 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453" exitCode=143 Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.138867 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438" exitCode=143 Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.138675 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.138518 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139009 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139042 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139060 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139116 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139132 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c"} Mar 13 20:40:56 crc 
kubenswrapper[4790]: I0313 20:40:56.139148 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139180 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139190 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139197 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139204 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139211 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139219 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139227 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453"} Mar 13 20:40:56 crc 
kubenswrapper[4790]: I0313 20:40:56.139234 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139261 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139271 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139282 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139292 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139299 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139307 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139313 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139337 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139345 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139352 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139358 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139366 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139406 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139419 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139428 4790 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139435 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139442 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139449 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139456 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139507 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139514 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139521 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139527 4790 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139536 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"7c759d9eac24045ee77e532dda62f3a6c5e2ed387c3e9d1e970d8448a87220c0"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139547 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139556 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139594 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139604 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139610 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139617 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec"} Mar 13 
20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139624 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139630 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139636 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139643 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.141724 4790 generic.go:334] "Generic (PLEG): container finished" podID="53ef6fca-53d7-43a3-8d94-3a29f09cefc7" containerID="e1626ebac9dfac2a9c22f6978706c491f9807c012cee772ed96fdf2a048f10b7" exitCode=0 Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.141852 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" event={"ID":"53ef6fca-53d7-43a3-8d94-3a29f09cefc7","Type":"ContainerDied","Data":"e1626ebac9dfac2a9c22f6978706c491f9807c012cee772ed96fdf2a048f10b7"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.141964 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" event={"ID":"53ef6fca-53d7-43a3-8d94-3a29f09cefc7","Type":"ContainerStarted","Data":"a44a7f9cd07ab910e8e72360dca5fb6cd8b5e0b60939dcff80414d537984e1c9"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.185905 4790 scope.go:117] "RemoveContainer" 
containerID="78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.217694 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gz4fj"] Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.221021 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gz4fj"] Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.235312 4790 scope.go:117] "RemoveContainer" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.266793 4790 scope.go:117] "RemoveContainer" containerID="528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.282271 4790 scope.go:117] "RemoveContainer" containerID="5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.301205 4790 scope.go:117] "RemoveContainer" containerID="878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.342889 4790 scope.go:117] "RemoveContainer" containerID="eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.363023 4790 scope.go:117] "RemoveContainer" containerID="8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.384207 4790 scope.go:117] "RemoveContainer" containerID="8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.420873 4790 scope.go:117] "RemoveContainer" containerID="b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.474636 4790 scope.go:117] "RemoveContainer" 
containerID="f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.490814 4790 scope.go:117] "RemoveContainer" containerID="78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c" Mar 13 20:40:56 crc kubenswrapper[4790]: E0313 20:40:56.491317 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c\": container with ID starting with 78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c not found: ID does not exist" containerID="78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.491355 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c"} err="failed to get container status \"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c\": rpc error: code = NotFound desc = could not find container \"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c\": container with ID starting with 78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.491395 4790 scope.go:117] "RemoveContainer" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" Mar 13 20:40:56 crc kubenswrapper[4790]: E0313 20:40:56.491831 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\": container with ID starting with add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f not found: ID does not exist" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" Mar 13 20:40:56 crc 
kubenswrapper[4790]: I0313 20:40:56.491908 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f"} err="failed to get container status \"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\": rpc error: code = NotFound desc = could not find container \"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\": container with ID starting with add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.491958 4790 scope.go:117] "RemoveContainer" containerID="528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd" Mar 13 20:40:56 crc kubenswrapper[4790]: E0313 20:40:56.492471 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\": container with ID starting with 528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd not found: ID does not exist" containerID="528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.492499 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd"} err="failed to get container status \"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\": rpc error: code = NotFound desc = could not find container \"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\": container with ID starting with 528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.492520 4790 scope.go:117] "RemoveContainer" containerID="5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167" Mar 13 
20:40:56 crc kubenswrapper[4790]: E0313 20:40:56.493072 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\": container with ID starting with 5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167 not found: ID does not exist" containerID="5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.493114 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167"} err="failed to get container status \"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\": rpc error: code = NotFound desc = could not find container \"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\": container with ID starting with 5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.493134 4790 scope.go:117] "RemoveContainer" containerID="878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96" Mar 13 20:40:56 crc kubenswrapper[4790]: E0313 20:40:56.493614 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\": container with ID starting with 878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96 not found: ID does not exist" containerID="878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.493635 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96"} err="failed to get container status 
\"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\": rpc error: code = NotFound desc = could not find container \"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\": container with ID starting with 878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.493674 4790 scope.go:117] "RemoveContainer" containerID="eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec" Mar 13 20:40:56 crc kubenswrapper[4790]: E0313 20:40:56.494009 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\": container with ID starting with eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec not found: ID does not exist" containerID="eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.494036 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec"} err="failed to get container status \"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\": rpc error: code = NotFound desc = could not find container \"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\": container with ID starting with eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.494051 4790 scope.go:117] "RemoveContainer" containerID="8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c" Mar 13 20:40:56 crc kubenswrapper[4790]: E0313 20:40:56.494399 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\": container with ID starting with 8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c not found: ID does not exist" containerID="8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.494423 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c"} err="failed to get container status \"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\": rpc error: code = NotFound desc = could not find container \"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\": container with ID starting with 8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.494440 4790 scope.go:117] "RemoveContainer" containerID="8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453" Mar 13 20:40:56 crc kubenswrapper[4790]: E0313 20:40:56.494777 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\": container with ID starting with 8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453 not found: ID does not exist" containerID="8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.494802 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453"} err="failed to get container status \"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\": rpc error: code = NotFound desc = could not find container \"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\": container with ID 
starting with 8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.494819 4790 scope.go:117] "RemoveContainer" containerID="b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438" Mar 13 20:40:56 crc kubenswrapper[4790]: E0313 20:40:56.495115 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\": container with ID starting with b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438 not found: ID does not exist" containerID="b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.495141 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438"} err="failed to get container status \"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\": rpc error: code = NotFound desc = could not find container \"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\": container with ID starting with b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.495158 4790 scope.go:117] "RemoveContainer" containerID="f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768" Mar 13 20:40:56 crc kubenswrapper[4790]: E0313 20:40:56.495540 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\": container with ID starting with f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768 not found: ID does not exist" containerID="f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768" Mar 13 
20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.495565 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768"} err="failed to get container status \"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\": rpc error: code = NotFound desc = could not find container \"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\": container with ID starting with f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.495581 4790 scope.go:117] "RemoveContainer" containerID="78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.495866 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c"} err="failed to get container status \"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c\": rpc error: code = NotFound desc = could not find container \"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c\": container with ID starting with 78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.495887 4790 scope.go:117] "RemoveContainer" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.496236 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f"} err="failed to get container status \"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\": rpc error: code = NotFound desc = could not find container 
\"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\": container with ID starting with add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.496259 4790 scope.go:117] "RemoveContainer" containerID="528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.496582 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd"} err="failed to get container status \"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\": rpc error: code = NotFound desc = could not find container \"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\": container with ID starting with 528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.496602 4790 scope.go:117] "RemoveContainer" containerID="5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.496903 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167"} err="failed to get container status \"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\": rpc error: code = NotFound desc = could not find container \"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\": container with ID starting with 5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.496931 4790 scope.go:117] "RemoveContainer" containerID="878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.497190 4790 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96"} err="failed to get container status \"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\": rpc error: code = NotFound desc = could not find container \"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\": container with ID starting with 878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.497207 4790 scope.go:117] "RemoveContainer" containerID="eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.497593 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec"} err="failed to get container status \"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\": rpc error: code = NotFound desc = could not find container \"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\": container with ID starting with eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.497614 4790 scope.go:117] "RemoveContainer" containerID="8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.497914 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c"} err="failed to get container status \"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\": rpc error: code = NotFound desc = could not find container \"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\": container with ID starting with 
8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.497932 4790 scope.go:117] "RemoveContainer" containerID="8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.498194 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453"} err="failed to get container status \"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\": rpc error: code = NotFound desc = could not find container \"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\": container with ID starting with 8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.498211 4790 scope.go:117] "RemoveContainer" containerID="b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.498507 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438"} err="failed to get container status \"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\": rpc error: code = NotFound desc = could not find container \"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\": container with ID starting with b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.498524 4790 scope.go:117] "RemoveContainer" containerID="f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.498816 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768"} err="failed to get container status \"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\": rpc error: code = NotFound desc = could not find container \"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\": container with ID starting with f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.498833 4790 scope.go:117] "RemoveContainer" containerID="78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.499185 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c"} err="failed to get container status \"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c\": rpc error: code = NotFound desc = could not find container \"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c\": container with ID starting with 78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.499213 4790 scope.go:117] "RemoveContainer" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.499514 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f"} err="failed to get container status \"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\": rpc error: code = NotFound desc = could not find container \"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\": container with ID starting with add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f not found: ID does not 
exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.499532 4790 scope.go:117] "RemoveContainer" containerID="528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.499787 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd"} err="failed to get container status \"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\": rpc error: code = NotFound desc = could not find container \"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\": container with ID starting with 528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.499803 4790 scope.go:117] "RemoveContainer" containerID="5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.500124 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167"} err="failed to get container status \"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\": rpc error: code = NotFound desc = could not find container \"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\": container with ID starting with 5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.500146 4790 scope.go:117] "RemoveContainer" containerID="878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.500512 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96"} err="failed to get container status 
\"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\": rpc error: code = NotFound desc = could not find container \"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\": container with ID starting with 878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.500534 4790 scope.go:117] "RemoveContainer" containerID="eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.500872 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec"} err="failed to get container status \"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\": rpc error: code = NotFound desc = could not find container \"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\": container with ID starting with eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.500894 4790 scope.go:117] "RemoveContainer" containerID="8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.501247 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c"} err="failed to get container status \"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\": rpc error: code = NotFound desc = could not find container \"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\": container with ID starting with 8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.501271 4790 scope.go:117] "RemoveContainer" 
containerID="8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.501577 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453"} err="failed to get container status \"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\": rpc error: code = NotFound desc = could not find container \"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\": container with ID starting with 8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.501601 4790 scope.go:117] "RemoveContainer" containerID="b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.501937 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438"} err="failed to get container status \"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\": rpc error: code = NotFound desc = could not find container \"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\": container with ID starting with b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.501955 4790 scope.go:117] "RemoveContainer" containerID="f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.502270 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768"} err="failed to get container status \"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\": rpc error: code = NotFound desc = could 
not find container \"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\": container with ID starting with f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.502285 4790 scope.go:117] "RemoveContainer" containerID="78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.502615 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c"} err="failed to get container status \"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c\": rpc error: code = NotFound desc = could not find container \"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c\": container with ID starting with 78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.502634 4790 scope.go:117] "RemoveContainer" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.502992 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f"} err="failed to get container status \"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\": rpc error: code = NotFound desc = could not find container \"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\": container with ID starting with add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.503010 4790 scope.go:117] "RemoveContainer" containerID="528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 
20:40:56.503266 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd"} err="failed to get container status \"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\": rpc error: code = NotFound desc = could not find container \"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\": container with ID starting with 528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.503284 4790 scope.go:117] "RemoveContainer" containerID="5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.503532 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167"} err="failed to get container status \"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\": rpc error: code = NotFound desc = could not find container \"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\": container with ID starting with 5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.503547 4790 scope.go:117] "RemoveContainer" containerID="878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.504083 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96"} err="failed to get container status \"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\": rpc error: code = NotFound desc = could not find container \"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\": container with ID starting with 
878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.504102 4790 scope.go:117] "RemoveContainer" containerID="eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.504411 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec"} err="failed to get container status \"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\": rpc error: code = NotFound desc = could not find container \"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\": container with ID starting with eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.504435 4790 scope.go:117] "RemoveContainer" containerID="8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.504790 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c"} err="failed to get container status \"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\": rpc error: code = NotFound desc = could not find container \"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\": container with ID starting with 8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.504813 4790 scope.go:117] "RemoveContainer" containerID="8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.505115 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453"} err="failed to get container status \"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\": rpc error: code = NotFound desc = could not find container \"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\": container with ID starting with 8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.505137 4790 scope.go:117] "RemoveContainer" containerID="b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.505439 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438"} err="failed to get container status \"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\": rpc error: code = NotFound desc = could not find container \"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\": container with ID starting with b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.505467 4790 scope.go:117] "RemoveContainer" containerID="f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.505817 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768"} err="failed to get container status \"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\": rpc error: code = NotFound desc = could not find container \"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\": container with ID starting with f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768 not found: ID does not 
exist" Mar 13 20:40:57 crc kubenswrapper[4790]: I0313 20:40:57.157683 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" event={"ID":"53ef6fca-53d7-43a3-8d94-3a29f09cefc7","Type":"ContainerStarted","Data":"6aed25839fda9ef82da8cdf8a54bbb1153e9be0e50ace1d41afa4232a5c3f02d"} Mar 13 20:40:57 crc kubenswrapper[4790]: I0313 20:40:57.158014 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" event={"ID":"53ef6fca-53d7-43a3-8d94-3a29f09cefc7","Type":"ContainerStarted","Data":"510339f6f8da757231c4c47aac0c734cc2940ff0578e3fbd62814c9e118ff6b1"} Mar 13 20:40:57 crc kubenswrapper[4790]: I0313 20:40:57.158030 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" event={"ID":"53ef6fca-53d7-43a3-8d94-3a29f09cefc7","Type":"ContainerStarted","Data":"c406f3dd7ae8fb51d3ec4666101f9955cb3361b5a13f153799ea8b3d2c610d88"} Mar 13 20:40:57 crc kubenswrapper[4790]: I0313 20:40:57.158040 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" event={"ID":"53ef6fca-53d7-43a3-8d94-3a29f09cefc7","Type":"ContainerStarted","Data":"14ffe1cd22064fd24c3d3d662fbcc0523d30a139620cc52b7ecee44bebb49956"} Mar 13 20:40:57 crc kubenswrapper[4790]: I0313 20:40:57.158051 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" event={"ID":"53ef6fca-53d7-43a3-8d94-3a29f09cefc7","Type":"ContainerStarted","Data":"8b30842bff1c3268a3a2d67fa4f2cbb7177a0a9f34737190eb1176e1a2c70080"} Mar 13 20:40:57 crc kubenswrapper[4790]: I0313 20:40:57.158065 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" event={"ID":"53ef6fca-53d7-43a3-8d94-3a29f09cefc7","Type":"ContainerStarted","Data":"081c75b02f7fa77a1d992c1a2b12a291cb3a0bcb515cd3f115037d7250608bfe"} Mar 13 20:40:57 crc kubenswrapper[4790]: I0313 20:40:57.159042 4790 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x2tjg_207e7f49-094a-4e59-a8ff-9eacd8d6fe2a/kube-multus/2.log" Mar 13 20:40:57 crc kubenswrapper[4790]: I0313 20:40:57.672944 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" path="/var/lib/kubelet/pods/a0c9dff4-5508-4391-bb03-6710c2b9f3b5/volumes" Mar 13 20:41:00 crc kubenswrapper[4790]: I0313 20:41:00.189048 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" event={"ID":"53ef6fca-53d7-43a3-8d94-3a29f09cefc7","Type":"ContainerStarted","Data":"490c72b57e53c5188ced2cc1e8a30d664b1c561d7d375b1398cdf59179252de3"} Mar 13 20:41:02 crc kubenswrapper[4790]: I0313 20:41:02.208697 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" event={"ID":"53ef6fca-53d7-43a3-8d94-3a29f09cefc7","Type":"ContainerStarted","Data":"db83114523f379f590611fc9a77d035b663ae2f250efe670e96ad0e03365e2a2"} Mar 13 20:41:03 crc kubenswrapper[4790]: I0313 20:41:03.215419 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:41:03 crc kubenswrapper[4790]: I0313 20:41:03.215997 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:41:03 crc kubenswrapper[4790]: I0313 20:41:03.244446 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:41:03 crc kubenswrapper[4790]: I0313 20:41:03.274937 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" podStartSLOduration=8.274915016 podStartE2EDuration="8.274915016s" podCreationTimestamp="2026-03-13 20:40:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-13 20:41:03.248802985 +0000 UTC m=+794.269918886" watchObservedRunningTime="2026-03-13 20:41:03.274915016 +0000 UTC m=+794.296030907" Mar 13 20:41:04 crc kubenswrapper[4790]: I0313 20:41:04.221538 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:41:04 crc kubenswrapper[4790]: I0313 20:41:04.257084 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:41:11 crc kubenswrapper[4790]: I0313 20:41:11.659723 4790 scope.go:117] "RemoveContainer" containerID="5a664c8908a82d034ede1821b9b77be44539b262b67dbd487d1b8e0a90a94221" Mar 13 20:41:12 crc kubenswrapper[4790]: I0313 20:41:12.277878 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x2tjg_207e7f49-094a-4e59-a8ff-9eacd8d6fe2a/kube-multus/2.log" Mar 13 20:41:12 crc kubenswrapper[4790]: I0313 20:41:12.278273 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x2tjg" event={"ID":"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a","Type":"ContainerStarted","Data":"7d6d3b206a300169a846037d851026e58ef95aff89b8688100fcc3c7cd819164"} Mar 13 20:41:14 crc kubenswrapper[4790]: I0313 20:41:14.015894 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:41:14 crc kubenswrapper[4790]: I0313 20:41:14.016511 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:41:22 crc 
kubenswrapper[4790]: I0313 20:41:22.152575 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v"] Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.154487 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.157744 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v"] Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.158313 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.291431 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c7tc\" (UniqueName: \"kubernetes.io/projected/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-kube-api-access-2c7tc\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.291494 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.291518 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.392062 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c7tc\" (UniqueName: \"kubernetes.io/projected/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-kube-api-access-2c7tc\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.392122 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.392148 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.392589 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v\" (UID: 
\"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.392622 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.410004 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c7tc\" (UniqueName: \"kubernetes.io/projected/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-kube-api-access-2c7tc\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.477904 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.707953 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v"] Mar 13 20:41:22 crc kubenswrapper[4790]: W0313 20:41:22.714678 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16b499fa_d8a4_4f3f_bcaf_aa9fa7b43854.slice/crio-2a9b770391618be51bfecf5e5e350aa3ccecdf6e566f58a46e98ada7617319d3 WatchSource:0}: Error finding container 2a9b770391618be51bfecf5e5e350aa3ccecdf6e566f58a46e98ada7617319d3: Status 404 returned error can't find the container with id 2a9b770391618be51bfecf5e5e350aa3ccecdf6e566f58a46e98ada7617319d3 Mar 13 20:41:23 crc kubenswrapper[4790]: I0313 20:41:23.351077 4790 generic.go:334] "Generic (PLEG): container finished" podID="16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" containerID="29db7f843e5eb6932a590b6089f2077fb8134b027aedbf5bd8d88a8ecd0dfd07" exitCode=0 Mar 13 20:41:23 crc kubenswrapper[4790]: I0313 20:41:23.351135 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" event={"ID":"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854","Type":"ContainerDied","Data":"29db7f843e5eb6932a590b6089f2077fb8134b027aedbf5bd8d88a8ecd0dfd07"} Mar 13 20:41:23 crc kubenswrapper[4790]: I0313 20:41:23.351163 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" event={"ID":"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854","Type":"ContainerStarted","Data":"2a9b770391618be51bfecf5e5e350aa3ccecdf6e566f58a46e98ada7617319d3"} Mar 13 20:41:25 crc kubenswrapper[4790]: I0313 20:41:25.363626 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" containerID="5c12e9d48e3107cfc1c450549d21e1d27c785d58c90ee901969e43971943f9c1" exitCode=0 Mar 13 20:41:25 crc kubenswrapper[4790]: I0313 20:41:25.363720 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" event={"ID":"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854","Type":"ContainerDied","Data":"5c12e9d48e3107cfc1c450549d21e1d27c785d58c90ee901969e43971943f9c1"} Mar 13 20:41:25 crc kubenswrapper[4790]: I0313 20:41:25.818496 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:41:26 crc kubenswrapper[4790]: I0313 20:41:26.372839 4790 generic.go:334] "Generic (PLEG): container finished" podID="16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" containerID="58323afdd50e070f00b267a705c22daed7d2836118e84819cbc88623904dd505" exitCode=0 Mar 13 20:41:26 crc kubenswrapper[4790]: I0313 20:41:26.372884 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" event={"ID":"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854","Type":"ContainerDied","Data":"58323afdd50e070f00b267a705c22daed7d2836118e84819cbc88623904dd505"} Mar 13 20:41:27 crc kubenswrapper[4790]: I0313 20:41:27.690359 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:27 crc kubenswrapper[4790]: I0313 20:41:27.783597 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-util\") pod \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " Mar 13 20:41:27 crc kubenswrapper[4790]: I0313 20:41:27.819126 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-util" (OuterVolumeSpecName: "util") pod "16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" (UID: "16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:41:27 crc kubenswrapper[4790]: I0313 20:41:27.884935 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c7tc\" (UniqueName: \"kubernetes.io/projected/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-kube-api-access-2c7tc\") pod \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " Mar 13 20:41:27 crc kubenswrapper[4790]: I0313 20:41:27.885074 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-bundle\") pod \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " Mar 13 20:41:27 crc kubenswrapper[4790]: I0313 20:41:27.885671 4790 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-util\") on node \"crc\" DevicePath \"\"" Mar 13 20:41:27 crc kubenswrapper[4790]: I0313 20:41:27.886262 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-bundle" (OuterVolumeSpecName: "bundle") pod "16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" (UID: "16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:41:27 crc kubenswrapper[4790]: I0313 20:41:27.891493 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-kube-api-access-2c7tc" (OuterVolumeSpecName: "kube-api-access-2c7tc") pod "16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" (UID: "16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854"). InnerVolumeSpecName "kube-api-access-2c7tc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:41:27 crc kubenswrapper[4790]: I0313 20:41:27.986479 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c7tc\" (UniqueName: \"kubernetes.io/projected/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-kube-api-access-2c7tc\") on node \"crc\" DevicePath \"\"" Mar 13 20:41:27 crc kubenswrapper[4790]: I0313 20:41:27.986522 4790 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:41:28 crc kubenswrapper[4790]: I0313 20:41:28.399872 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" event={"ID":"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854","Type":"ContainerDied","Data":"2a9b770391618be51bfecf5e5e350aa3ccecdf6e566f58a46e98ada7617319d3"} Mar 13 20:41:28 crc kubenswrapper[4790]: I0313 20:41:28.399918 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a9b770391618be51bfecf5e5e350aa3ccecdf6e566f58a46e98ada7617319d3" Mar 13 20:41:28 crc kubenswrapper[4790]: I0313 20:41:28.399941 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.009414 4790 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.527537 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-4lvtv"] Mar 13 20:41:33 crc kubenswrapper[4790]: E0313 20:41:33.527748 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" containerName="pull" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.527761 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" containerName="pull" Mar 13 20:41:33 crc kubenswrapper[4790]: E0313 20:41:33.527778 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" containerName="extract" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.527785 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" containerName="extract" Mar 13 20:41:33 crc kubenswrapper[4790]: E0313 20:41:33.527803 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" containerName="util" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.527810 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" containerName="util" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.527902 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" containerName="extract" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.528283 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4lvtv" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.529960 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-w2dt4" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.530038 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.530522 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.543040 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-4lvtv"] Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.661069 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hnmj\" (UniqueName: \"kubernetes.io/projected/4d5f9755-21a7-482e-8788-85ed86738b40-kube-api-access-7hnmj\") pod \"nmstate-operator-796d4cfff4-4lvtv\" (UID: \"4d5f9755-21a7-482e-8788-85ed86738b40\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-4lvtv" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.762918 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hnmj\" (UniqueName: \"kubernetes.io/projected/4d5f9755-21a7-482e-8788-85ed86738b40-kube-api-access-7hnmj\") pod \"nmstate-operator-796d4cfff4-4lvtv\" (UID: \"4d5f9755-21a7-482e-8788-85ed86738b40\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-4lvtv" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.781508 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hnmj\" (UniqueName: \"kubernetes.io/projected/4d5f9755-21a7-482e-8788-85ed86738b40-kube-api-access-7hnmj\") pod \"nmstate-operator-796d4cfff4-4lvtv\" (UID: 
\"4d5f9755-21a7-482e-8788-85ed86738b40\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-4lvtv" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.845206 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4lvtv" Mar 13 20:41:34 crc kubenswrapper[4790]: I0313 20:41:34.283499 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-4lvtv"] Mar 13 20:41:34 crc kubenswrapper[4790]: I0313 20:41:34.437184 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4lvtv" event={"ID":"4d5f9755-21a7-482e-8788-85ed86738b40","Type":"ContainerStarted","Data":"07a4557833fe001890bdeb37abfe81b6abb4be6a5e5df0e8dfd9dd8354ba3129"} Mar 13 20:41:37 crc kubenswrapper[4790]: I0313 20:41:37.456481 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4lvtv" event={"ID":"4d5f9755-21a7-482e-8788-85ed86738b40","Type":"ContainerStarted","Data":"9cbf7026d5cf7dc8ace2a5809a91f0f78cd3b97654ae49ad9dced8d2f687e7a5"} Mar 13 20:41:37 crc kubenswrapper[4790]: I0313 20:41:37.492331 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4lvtv" podStartSLOduration=2.296707132 podStartE2EDuration="4.492316912s" podCreationTimestamp="2026-03-13 20:41:33 +0000 UTC" firstStartedPulling="2026-03-13 20:41:34.294476464 +0000 UTC m=+825.315592355" lastFinishedPulling="2026-03-13 20:41:36.490086244 +0000 UTC m=+827.511202135" observedRunningTime="2026-03-13 20:41:37.489135785 +0000 UTC m=+828.510251676" watchObservedRunningTime="2026-03-13 20:41:37.492316912 +0000 UTC m=+828.513432803" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.304724 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95"] Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 
20:41:42.306070 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95" Mar 13 20:41:42 crc kubenswrapper[4790]: W0313 20:41:42.309568 4790 reflector.go:561] object-"openshift-nmstate"/"nmstate-handler-dockercfg-dbjg2": failed to list *v1.Secret: secrets "nmstate-handler-dockercfg-dbjg2" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-nmstate": no relationship found between node 'crc' and this object Mar 13 20:41:42 crc kubenswrapper[4790]: E0313 20:41:42.309639 4790 reflector.go:158] "Unhandled Error" err="object-\"openshift-nmstate\"/\"nmstate-handler-dockercfg-dbjg2\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"nmstate-handler-dockercfg-dbjg2\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-nmstate\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.355505 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95"] Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.355588 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-qld4w"] Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.356649 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.358601 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.368160 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-b2697"] Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.368976 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.390815 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-qld4w"] Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.435178 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs"] Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.435927 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.437691 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.437746 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.437827 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-qr5bt" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.450195 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs"] Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.474657 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvzcz\" (UniqueName: \"kubernetes.io/projected/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-kube-api-access-xvzcz\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.474718 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4p44\" (UniqueName: \"kubernetes.io/projected/e1a3b709-858c-4bca-b52b-c96dc23d9149-kube-api-access-q4p44\") pod \"nmstate-webhook-5f558f5558-qld4w\" (UID: \"e1a3b709-858c-4bca-b52b-c96dc23d9149\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.474775 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-ovs-socket\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " 
pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.474797 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbpdg\" (UniqueName: \"kubernetes.io/projected/4295503b-996b-4a20-844b-07a90de225a6-kube-api-access-kbpdg\") pod \"nmstate-metrics-9b8c8685d-wvv95\" (UID: \"4295503b-996b-4a20-844b-07a90de225a6\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.474920 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-nmstate-lock\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.475029 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-dbus-socket\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.475064 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e1a3b709-858c-4bca-b52b-c96dc23d9149-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-qld4w\" (UID: \"e1a3b709-858c-4bca-b52b-c96dc23d9149\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.576705 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s6wq\" (UniqueName: \"kubernetes.io/projected/c7ef6baa-3c87-44a8-91d2-bcfbc0696396-kube-api-access-7s6wq\") pod 
\"nmstate-console-plugin-86f58fcf4-k8mcs\" (UID: \"c7ef6baa-3c87-44a8-91d2-bcfbc0696396\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.576773 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-ovs-socket\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.576819 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbpdg\" (UniqueName: \"kubernetes.io/projected/4295503b-996b-4a20-844b-07a90de225a6-kube-api-access-kbpdg\") pod \"nmstate-metrics-9b8c8685d-wvv95\" (UID: \"4295503b-996b-4a20-844b-07a90de225a6\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.576846 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7ef6baa-3c87-44a8-91d2-bcfbc0696396-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-k8mcs\" (UID: \"c7ef6baa-3c87-44a8-91d2-bcfbc0696396\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.576897 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c7ef6baa-3c87-44a8-91d2-bcfbc0696396-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-k8mcs\" (UID: \"c7ef6baa-3c87-44a8-91d2-bcfbc0696396\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.576907 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-ovs-socket\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.576934 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-nmstate-lock\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.576966 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-nmstate-lock\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.576982 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-dbus-socket\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.577007 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e1a3b709-858c-4bca-b52b-c96dc23d9149-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-qld4w\" (UID: \"e1a3b709-858c-4bca-b52b-c96dc23d9149\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.577058 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvzcz\" (UniqueName: \"kubernetes.io/projected/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-kube-api-access-xvzcz\") 
pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.577080 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4p44\" (UniqueName: \"kubernetes.io/projected/e1a3b709-858c-4bca-b52b-c96dc23d9149-kube-api-access-q4p44\") pod \"nmstate-webhook-5f558f5558-qld4w\" (UID: \"e1a3b709-858c-4bca-b52b-c96dc23d9149\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.577472 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-dbus-socket\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.591352 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e1a3b709-858c-4bca-b52b-c96dc23d9149-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-qld4w\" (UID: \"e1a3b709-858c-4bca-b52b-c96dc23d9149\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.602215 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbpdg\" (UniqueName: \"kubernetes.io/projected/4295503b-996b-4a20-844b-07a90de225a6-kube-api-access-kbpdg\") pod \"nmstate-metrics-9b8c8685d-wvv95\" (UID: \"4295503b-996b-4a20-844b-07a90de225a6\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.604864 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4p44\" (UniqueName: \"kubernetes.io/projected/e1a3b709-858c-4bca-b52b-c96dc23d9149-kube-api-access-q4p44\") pod 
\"nmstate-webhook-5f558f5558-qld4w\" (UID: \"e1a3b709-858c-4bca-b52b-c96dc23d9149\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.615927 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvzcz\" (UniqueName: \"kubernetes.io/projected/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-kube-api-access-xvzcz\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.635073 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-77d465584b-7dwm5"] Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.636345 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.658594 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77d465584b-7dwm5"] Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.678642 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c7ef6baa-3c87-44a8-91d2-bcfbc0696396-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-k8mcs\" (UID: \"c7ef6baa-3c87-44a8-91d2-bcfbc0696396\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.678750 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s6wq\" (UniqueName: \"kubernetes.io/projected/c7ef6baa-3c87-44a8-91d2-bcfbc0696396-kube-api-access-7s6wq\") pod \"nmstate-console-plugin-86f58fcf4-k8mcs\" (UID: \"c7ef6baa-3c87-44a8-91d2-bcfbc0696396\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.678774 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7ef6baa-3c87-44a8-91d2-bcfbc0696396-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-k8mcs\" (UID: \"c7ef6baa-3c87-44a8-91d2-bcfbc0696396\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.679539 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c7ef6baa-3c87-44a8-91d2-bcfbc0696396-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-k8mcs\" (UID: \"c7ef6baa-3c87-44a8-91d2-bcfbc0696396\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.681532 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7ef6baa-3c87-44a8-91d2-bcfbc0696396-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-k8mcs\" (UID: \"c7ef6baa-3c87-44a8-91d2-bcfbc0696396\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.697608 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s6wq\" (UniqueName: \"kubernetes.io/projected/c7ef6baa-3c87-44a8-91d2-bcfbc0696396-kube-api-access-7s6wq\") pod \"nmstate-console-plugin-86f58fcf4-k8mcs\" (UID: \"c7ef6baa-3c87-44a8-91d2-bcfbc0696396\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.748845 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.779671 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-console-config\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.779980 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-oauth-serving-cert\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.780127 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-trusted-ca-bundle\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.780234 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh6z6\" (UniqueName: \"kubernetes.io/projected/7d4e30e7-0446-4370-bfad-e2824747e0fe-kube-api-access-kh6z6\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.780348 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/7d4e30e7-0446-4370-bfad-e2824747e0fe-console-oauth-config\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.780495 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d4e30e7-0446-4370-bfad-e2824747e0fe-console-serving-cert\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.780621 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-service-ca\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.882196 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d4e30e7-0446-4370-bfad-e2824747e0fe-console-serving-cert\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.882255 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-service-ca\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.882282 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-console-config\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.882305 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-oauth-serving-cert\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.882330 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-trusted-ca-bundle\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.882348 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh6z6\" (UniqueName: \"kubernetes.io/projected/7d4e30e7-0446-4370-bfad-e2824747e0fe-kube-api-access-kh6z6\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.882372 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7d4e30e7-0446-4370-bfad-e2824747e0fe-console-oauth-config\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.884941 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-oauth-serving-cert\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.885422 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-console-config\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.885422 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-trusted-ca-bundle\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.885544 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-service-ca\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.888604 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d4e30e7-0446-4370-bfad-e2824747e0fe-console-serving-cert\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.898309 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/7d4e30e7-0446-4370-bfad-e2824747e0fe-console-oauth-config\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.905950 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh6z6\" (UniqueName: \"kubernetes.io/projected/7d4e30e7-0446-4370-bfad-e2824747e0fe-kube-api-access-kh6z6\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.963581 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.161779 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs"] Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.242658 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77d465584b-7dwm5"] Mar 13 20:41:43 crc kubenswrapper[4790]: W0313 20:41:43.247364 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d4e30e7_0446_4370_bfad_e2824747e0fe.slice/crio-ef7e8518a7b219f8729d4c62b131d1d8fd423d3128ff7ad868a5b281775b62e1 WatchSource:0}: Error finding container ef7e8518a7b219f8729d4c62b131d1d8fd423d3128ff7ad868a5b281775b62e1: Status 404 returned error can't find the container with id ef7e8518a7b219f8729d4c62b131d1d8fd423d3128ff7ad868a5b281775b62e1 Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.492237 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" 
event={"ID":"c7ef6baa-3c87-44a8-91d2-bcfbc0696396","Type":"ContainerStarted","Data":"846d6731c2416f6ca3400ce228a4640bf9f27862d957edf9e4d6432423abc67f"} Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.494224 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77d465584b-7dwm5" event={"ID":"7d4e30e7-0446-4370-bfad-e2824747e0fe","Type":"ContainerStarted","Data":"895ba00e796bfdae6263f67d5d233af9b5adf62ff51fe803cc5bb3ef2ca47f23"} Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.494265 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77d465584b-7dwm5" event={"ID":"7d4e30e7-0446-4370-bfad-e2824747e0fe","Type":"ContainerStarted","Data":"ef7e8518a7b219f8729d4c62b131d1d8fd423d3128ff7ad868a5b281775b62e1"} Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.625279 4790 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95" secret="" err="failed to sync secret cache: timed out waiting for the condition" Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.625350 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95" Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.674241 4790 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" secret="" err="failed to sync secret cache: timed out waiting for the condition" Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.674321 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.690303 4790 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-nmstate/nmstate-handler-b2697" secret="" err="failed to sync secret cache: timed out waiting for the condition" Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.690415 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.722292 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-dbjg2" Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.852078 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77d465584b-7dwm5" podStartSLOduration=1.852060071 podStartE2EDuration="1.852060071s" podCreationTimestamp="2026-03-13 20:41:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:41:43.524534101 +0000 UTC m=+834.545650012" watchObservedRunningTime="2026-03-13 20:41:43.852060071 +0000 UTC m=+834.873175962" Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.854540 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95"] Mar 13 20:41:43 crc kubenswrapper[4790]: W0313 20:41:43.864554 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4295503b_996b_4a20_844b_07a90de225a6.slice/crio-fba28eeeb0bd2aa0b49a53041177030187ec9ef0ebb6a5a671e6dc28d39900b6 WatchSource:0}: Error finding container fba28eeeb0bd2aa0b49a53041177030187ec9ef0ebb6a5a671e6dc28d39900b6: Status 404 returned error can't find the container with id fba28eeeb0bd2aa0b49a53041177030187ec9ef0ebb6a5a671e6dc28d39900b6 Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.891411 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-qld4w"] Mar 13 20:41:43 crc kubenswrapper[4790]: 
W0313 20:41:43.897875 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1a3b709_858c_4bca_b52b_c96dc23d9149.slice/crio-2873a133fccdf3da6a8a39f9a35dd13450d9f11bf2904a9ab76b350e4d4e6c32 WatchSource:0}: Error finding container 2873a133fccdf3da6a8a39f9a35dd13450d9f11bf2904a9ab76b350e4d4e6c32: Status 404 returned error can't find the container with id 2873a133fccdf3da6a8a39f9a35dd13450d9f11bf2904a9ab76b350e4d4e6c32 Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.015593 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.015648 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.015688 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.016059 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"79e02ea9be9e1c9905df96f4d2c3972a24c6d7bee0d427327ce884018a382f4c"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.016424 4790 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://79e02ea9be9e1c9905df96f4d2c3972a24c6d7bee0d427327ce884018a382f4c" gracePeriod=600 Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.514365 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="79e02ea9be9e1c9905df96f4d2c3972a24c6d7bee0d427327ce884018a382f4c" exitCode=0 Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.514419 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"79e02ea9be9e1c9905df96f4d2c3972a24c6d7bee0d427327ce884018a382f4c"} Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.514800 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"1c2f579c051539fdc9bad07dcbfb84169db8dd999445ba48e52c550831462bdf"} Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.514838 4790 scope.go:117] "RemoveContainer" containerID="876ea65d0ee844d8eca512c0665da98289a1647386d506ab2af3d32c73dd69b4" Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.515812 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-b2697" event={"ID":"d5c9a572-635b-4ecc-a2a4-c7e459d6d510","Type":"ContainerStarted","Data":"ddc80cbcf43c4c96d31a8e3cc04162581c56b8a391408fde03ae1a1481dd63d9"} Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.518165 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" 
event={"ID":"e1a3b709-858c-4bca-b52b-c96dc23d9149","Type":"ContainerStarted","Data":"2873a133fccdf3da6a8a39f9a35dd13450d9f11bf2904a9ab76b350e4d4e6c32"} Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.519323 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95" event={"ID":"4295503b-996b-4a20-844b-07a90de225a6","Type":"ContainerStarted","Data":"fba28eeeb0bd2aa0b49a53041177030187ec9ef0ebb6a5a671e6dc28d39900b6"} Mar 13 20:41:46 crc kubenswrapper[4790]: I0313 20:41:46.535607 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" event={"ID":"c7ef6baa-3c87-44a8-91d2-bcfbc0696396","Type":"ContainerStarted","Data":"b4cb991ef4d053abb965c3d016877324844701636042a2b010228ec59cbc5e5f"} Mar 13 20:41:46 crc kubenswrapper[4790]: I0313 20:41:46.562122 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" podStartSLOduration=2.177108334 podStartE2EDuration="4.562101622s" podCreationTimestamp="2026-03-13 20:41:42 +0000 UTC" firstStartedPulling="2026-03-13 20:41:43.169478351 +0000 UTC m=+834.190594242" lastFinishedPulling="2026-03-13 20:41:45.554471639 +0000 UTC m=+836.575587530" observedRunningTime="2026-03-13 20:41:46.558080795 +0000 UTC m=+837.579196686" watchObservedRunningTime="2026-03-13 20:41:46.562101622 +0000 UTC m=+837.583217513" Mar 13 20:41:47 crc kubenswrapper[4790]: I0313 20:41:47.542839 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-b2697" event={"ID":"d5c9a572-635b-4ecc-a2a4-c7e459d6d510","Type":"ContainerStarted","Data":"ea4d76800a4baf79fd39c71f6201900141eeffb7999edd66a020107e37307343"} Mar 13 20:41:47 crc kubenswrapper[4790]: I0313 20:41:47.543966 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:47 crc kubenswrapper[4790]: I0313 
20:41:47.545336 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" event={"ID":"e1a3b709-858c-4bca-b52b-c96dc23d9149","Type":"ContainerStarted","Data":"152fe3720b2999f06b7bcae5f9e4b3a19918cc4161b995157a04dba1b462b246"} Mar 13 20:41:47 crc kubenswrapper[4790]: I0313 20:41:47.545431 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" Mar 13 20:41:47 crc kubenswrapper[4790]: I0313 20:41:47.546828 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95" event={"ID":"4295503b-996b-4a20-844b-07a90de225a6","Type":"ContainerStarted","Data":"3490528d1ce2cf70c747456b42a262d605def2c49f3747afb1ee75fecbe7aa70"} Mar 13 20:41:47 crc kubenswrapper[4790]: I0313 20:41:47.557239 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-b2697" podStartSLOduration=2.860254338 podStartE2EDuration="5.557219788s" podCreationTimestamp="2026-03-13 20:41:42 +0000 UTC" firstStartedPulling="2026-03-13 20:41:43.720225249 +0000 UTC m=+834.741341140" lastFinishedPulling="2026-03-13 20:41:46.417190699 +0000 UTC m=+837.438306590" observedRunningTime="2026-03-13 20:41:47.556665254 +0000 UTC m=+838.577781155" watchObservedRunningTime="2026-03-13 20:41:47.557219788 +0000 UTC m=+838.578335679" Mar 13 20:41:47 crc kubenswrapper[4790]: I0313 20:41:47.577383 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" podStartSLOduration=3.077947987 podStartE2EDuration="5.57735156s" podCreationTimestamp="2026-03-13 20:41:42 +0000 UTC" firstStartedPulling="2026-03-13 20:41:43.899830084 +0000 UTC m=+834.920945975" lastFinishedPulling="2026-03-13 20:41:46.399233657 +0000 UTC m=+837.420349548" observedRunningTime="2026-03-13 20:41:47.572127179 +0000 UTC m=+838.593243080" 
watchObservedRunningTime="2026-03-13 20:41:47.57735156 +0000 UTC m=+838.598467481" Mar 13 20:41:49 crc kubenswrapper[4790]: I0313 20:41:49.558912 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95" event={"ID":"4295503b-996b-4a20-844b-07a90de225a6","Type":"ContainerStarted","Data":"8302713ee76c52b13528f8b8de7c7ab9f67e43244468b5763c009d7db89fa3a5"} Mar 13 20:41:49 crc kubenswrapper[4790]: I0313 20:41:49.608312 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95" podStartSLOduration=2.213288966 podStartE2EDuration="7.608216323s" podCreationTimestamp="2026-03-13 20:41:42 +0000 UTC" firstStartedPulling="2026-03-13 20:41:43.866083088 +0000 UTC m=+834.887198979" lastFinishedPulling="2026-03-13 20:41:49.261010435 +0000 UTC m=+840.282126336" observedRunningTime="2026-03-13 20:41:49.583070147 +0000 UTC m=+840.604186088" watchObservedRunningTime="2026-03-13 20:41:49.608216323 +0000 UTC m=+840.629332264" Mar 13 20:41:52 crc kubenswrapper[4790]: I0313 20:41:52.964442 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:52 crc kubenswrapper[4790]: I0313 20:41:52.964717 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:52 crc kubenswrapper[4790]: I0313 20:41:52.971659 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:53 crc kubenswrapper[4790]: I0313 20:41:53.595608 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:53 crc kubenswrapper[4790]: I0313 20:41:53.656951 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-q5j7f"] Mar 13 20:41:53 crc kubenswrapper[4790]: I0313 
20:41:53.727630 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:42:00 crc kubenswrapper[4790]: I0313 20:42:00.141548 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557242-lp8qf"] Mar 13 20:42:00 crc kubenswrapper[4790]: I0313 20:42:00.143228 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557242-lp8qf" Mar 13 20:42:00 crc kubenswrapper[4790]: I0313 20:42:00.146338 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:42:00 crc kubenswrapper[4790]: I0313 20:42:00.146975 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:42:00 crc kubenswrapper[4790]: I0313 20:42:00.147787 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:42:00 crc kubenswrapper[4790]: I0313 20:42:00.153512 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557242-lp8qf"] Mar 13 20:42:00 crc kubenswrapper[4790]: I0313 20:42:00.235605 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2sz4\" (UniqueName: \"kubernetes.io/projected/6027d153-5f8e-4bb1-8275-9a8df8c533f2-kube-api-access-g2sz4\") pod \"auto-csr-approver-29557242-lp8qf\" (UID: \"6027d153-5f8e-4bb1-8275-9a8df8c533f2\") " pod="openshift-infra/auto-csr-approver-29557242-lp8qf" Mar 13 20:42:00 crc kubenswrapper[4790]: I0313 20:42:00.337113 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2sz4\" (UniqueName: \"kubernetes.io/projected/6027d153-5f8e-4bb1-8275-9a8df8c533f2-kube-api-access-g2sz4\") pod \"auto-csr-approver-29557242-lp8qf\" (UID: 
\"6027d153-5f8e-4bb1-8275-9a8df8c533f2\") " pod="openshift-infra/auto-csr-approver-29557242-lp8qf" Mar 13 20:42:00 crc kubenswrapper[4790]: I0313 20:42:00.357830 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2sz4\" (UniqueName: \"kubernetes.io/projected/6027d153-5f8e-4bb1-8275-9a8df8c533f2-kube-api-access-g2sz4\") pod \"auto-csr-approver-29557242-lp8qf\" (UID: \"6027d153-5f8e-4bb1-8275-9a8df8c533f2\") " pod="openshift-infra/auto-csr-approver-29557242-lp8qf" Mar 13 20:42:00 crc kubenswrapper[4790]: I0313 20:42:00.499521 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557242-lp8qf" Mar 13 20:42:00 crc kubenswrapper[4790]: I0313 20:42:00.941708 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557242-lp8qf"] Mar 13 20:42:00 crc kubenswrapper[4790]: W0313 20:42:00.952684 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6027d153_5f8e_4bb1_8275_9a8df8c533f2.slice/crio-ea830fbd6719c35b739a9a7f305932cbf5ba79466ea92c4d1f475ccdcebafa89 WatchSource:0}: Error finding container ea830fbd6719c35b739a9a7f305932cbf5ba79466ea92c4d1f475ccdcebafa89: Status 404 returned error can't find the container with id ea830fbd6719c35b739a9a7f305932cbf5ba79466ea92c4d1f475ccdcebafa89 Mar 13 20:42:01 crc kubenswrapper[4790]: I0313 20:42:01.648266 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557242-lp8qf" event={"ID":"6027d153-5f8e-4bb1-8275-9a8df8c533f2","Type":"ContainerStarted","Data":"ea830fbd6719c35b739a9a7f305932cbf5ba79466ea92c4d1f475ccdcebafa89"} Mar 13 20:42:02 crc kubenswrapper[4790]: I0313 20:42:02.656591 4790 generic.go:334] "Generic (PLEG): container finished" podID="6027d153-5f8e-4bb1-8275-9a8df8c533f2" containerID="31ce3becbe5f9fc73efb71d7c9c70a67bb2549c4e27e76481e3678501a4317cf" exitCode=0 
Mar 13 20:42:02 crc kubenswrapper[4790]: I0313 20:42:02.656716 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557242-lp8qf" event={"ID":"6027d153-5f8e-4bb1-8275-9a8df8c533f2","Type":"ContainerDied","Data":"31ce3becbe5f9fc73efb71d7c9c70a67bb2549c4e27e76481e3678501a4317cf"} Mar 13 20:42:03 crc kubenswrapper[4790]: I0313 20:42:03.682740 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" Mar 13 20:42:03 crc kubenswrapper[4790]: I0313 20:42:03.919727 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557242-lp8qf" Mar 13 20:42:04 crc kubenswrapper[4790]: I0313 20:42:04.092153 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2sz4\" (UniqueName: \"kubernetes.io/projected/6027d153-5f8e-4bb1-8275-9a8df8c533f2-kube-api-access-g2sz4\") pod \"6027d153-5f8e-4bb1-8275-9a8df8c533f2\" (UID: \"6027d153-5f8e-4bb1-8275-9a8df8c533f2\") " Mar 13 20:42:04 crc kubenswrapper[4790]: I0313 20:42:04.101434 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6027d153-5f8e-4bb1-8275-9a8df8c533f2-kube-api-access-g2sz4" (OuterVolumeSpecName: "kube-api-access-g2sz4") pod "6027d153-5f8e-4bb1-8275-9a8df8c533f2" (UID: "6027d153-5f8e-4bb1-8275-9a8df8c533f2"). InnerVolumeSpecName "kube-api-access-g2sz4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:42:04 crc kubenswrapper[4790]: I0313 20:42:04.194353 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2sz4\" (UniqueName: \"kubernetes.io/projected/6027d153-5f8e-4bb1-8275-9a8df8c533f2-kube-api-access-g2sz4\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:04 crc kubenswrapper[4790]: I0313 20:42:04.674611 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557242-lp8qf" event={"ID":"6027d153-5f8e-4bb1-8275-9a8df8c533f2","Type":"ContainerDied","Data":"ea830fbd6719c35b739a9a7f305932cbf5ba79466ea92c4d1f475ccdcebafa89"} Mar 13 20:42:04 crc kubenswrapper[4790]: I0313 20:42:04.674666 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea830fbd6719c35b739a9a7f305932cbf5ba79466ea92c4d1f475ccdcebafa89" Mar 13 20:42:04 crc kubenswrapper[4790]: I0313 20:42:04.674672 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557242-lp8qf" Mar 13 20:42:04 crc kubenswrapper[4790]: I0313 20:42:04.968167 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557236-tczbl"] Mar 13 20:42:04 crc kubenswrapper[4790]: I0313 20:42:04.974064 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557236-tczbl"] Mar 13 20:42:05 crc kubenswrapper[4790]: I0313 20:42:05.669123 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43b65fb5-f36b-4fae-ba13-03b5c81d1639" path="/var/lib/kubelet/pods/43b65fb5-f36b-4fae-ba13-03b5c81d1639/volumes" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.153015 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px"] Mar 13 20:42:18 crc kubenswrapper[4790]: E0313 20:42:18.153800 4790 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="6027d153-5f8e-4bb1-8275-9a8df8c533f2" containerName="oc" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.153816 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6027d153-5f8e-4bb1-8275-9a8df8c533f2" containerName="oc" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.153974 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6027d153-5f8e-4bb1-8275-9a8df8c533f2" containerName="oc" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.155060 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.160497 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px"] Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.162546 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.315320 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.315721 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtwst\" (UniqueName: \"kubernetes.io/projected/6940903a-9dc5-4001-bc87-9de2bdce9e52-kube-api-access-mtwst\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.315750 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.416810 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtwst\" (UniqueName: \"kubernetes.io/projected/6940903a-9dc5-4001-bc87-9de2bdce9e52-kube-api-access-mtwst\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.416864 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.416917 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 
20:42:18.417482 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.417485 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.437839 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtwst\" (UniqueName: \"kubernetes.io/projected/6940903a-9dc5-4001-bc87-9de2bdce9e52-kube-api-access-mtwst\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.478898 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.707242 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-q5j7f" podUID="d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" containerName="console" containerID="cri-o://40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80" gracePeriod=15 Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.860361 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px"] Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.534324 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-q5j7f_d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c/console/0.log" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.534408 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.631049 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-oauth-config\") pod \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.631266 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-service-ca\") pod \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.631291 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-trusted-ca-bundle\") pod \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.631337 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-config\") pod \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.631409 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-oauth-serving-cert\") pod \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.631437 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-serving-cert\") pod \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.631476 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chx4v\" (UniqueName: \"kubernetes.io/projected/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-kube-api-access-chx4v\") pod \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.631978 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" (UID: "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.632038 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-config" (OuterVolumeSpecName: "console-config") pod "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" (UID: "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.632453 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" (UID: "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.632455 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-service-ca" (OuterVolumeSpecName: "service-ca") pod "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" (UID: "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.637354 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-kube-api-access-chx4v" (OuterVolumeSpecName: "kube-api-access-chx4v") pod "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" (UID: "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c"). InnerVolumeSpecName "kube-api-access-chx4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.637691 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" (UID: "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.638069 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" (UID: "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.732622 4790 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.732667 4790 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.732679 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.732691 4790 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.732705 4790 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.732715 4790 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.732726 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chx4v\" (UniqueName: \"kubernetes.io/projected/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-kube-api-access-chx4v\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:19 crc 
kubenswrapper[4790]: I0313 20:42:19.766854 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-q5j7f_d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c/console/0.log" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.766934 4790 generic.go:334] "Generic (PLEG): container finished" podID="d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" containerID="40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80" exitCode=2 Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.767018 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q5j7f" event={"ID":"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c","Type":"ContainerDied","Data":"40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80"} Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.767023 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.767060 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q5j7f" event={"ID":"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c","Type":"ContainerDied","Data":"af91b2c2002cfba8d95ebe9f9e0aa50107b9d61f68613dde04ff9ae4ab302650"} Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.767087 4790 scope.go:117] "RemoveContainer" containerID="40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.768969 4790 generic.go:334] "Generic (PLEG): container finished" podID="6940903a-9dc5-4001-bc87-9de2bdce9e52" containerID="73d77ad67ac4d15b04010b038b87d30e7703e7f28501c37a118699adcf6e336f" exitCode=0 Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.769051 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" 
event={"ID":"6940903a-9dc5-4001-bc87-9de2bdce9e52","Type":"ContainerDied","Data":"73d77ad67ac4d15b04010b038b87d30e7703e7f28501c37a118699adcf6e336f"} Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.769119 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" event={"ID":"6940903a-9dc5-4001-bc87-9de2bdce9e52","Type":"ContainerStarted","Data":"b434b30cdb21943ca53f18eaf1729db4fddc700a45eee3938607a5e3f003edd9"} Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.782658 4790 scope.go:117] "RemoveContainer" containerID="40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80" Mar 13 20:42:19 crc kubenswrapper[4790]: E0313 20:42:19.783125 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80\": container with ID starting with 40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80 not found: ID does not exist" containerID="40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.783161 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80"} err="failed to get container status \"40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80\": rpc error: code = NotFound desc = could not find container \"40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80\": container with ID starting with 40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80 not found: ID does not exist" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.813413 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-q5j7f"] Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.817861 4790 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-q5j7f"] Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.509354 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fwvj9"] Mar 13 20:42:20 crc kubenswrapper[4790]: E0313 20:42:20.509759 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" containerName="console" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.509780 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" containerName="console" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.509996 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" containerName="console" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.511335 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.515541 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fwvj9"] Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.659308 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trdh2\" (UniqueName: \"kubernetes.io/projected/63d87b2a-7e33-4196-a549-c618ac863a8b-kube-api-access-trdh2\") pod \"redhat-operators-fwvj9\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.659401 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-catalog-content\") pod \"redhat-operators-fwvj9\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " 
pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.659433 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-utilities\") pod \"redhat-operators-fwvj9\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.760661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trdh2\" (UniqueName: \"kubernetes.io/projected/63d87b2a-7e33-4196-a549-c618ac863a8b-kube-api-access-trdh2\") pod \"redhat-operators-fwvj9\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.760716 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-catalog-content\") pod \"redhat-operators-fwvj9\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.760739 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-utilities\") pod \"redhat-operators-fwvj9\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.761507 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-catalog-content\") pod \"redhat-operators-fwvj9\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " 
pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.761548 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-utilities\") pod \"redhat-operators-fwvj9\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.781024 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trdh2\" (UniqueName: \"kubernetes.io/projected/63d87b2a-7e33-4196-a549-c618ac863a8b-kube-api-access-trdh2\") pod \"redhat-operators-fwvj9\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.883060 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:21 crc kubenswrapper[4790]: I0313 20:42:21.096793 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fwvj9"] Mar 13 20:42:21 crc kubenswrapper[4790]: I0313 20:42:21.668422 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" path="/var/lib/kubelet/pods/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c/volumes" Mar 13 20:42:21 crc kubenswrapper[4790]: I0313 20:42:21.782289 4790 generic.go:334] "Generic (PLEG): container finished" podID="6940903a-9dc5-4001-bc87-9de2bdce9e52" containerID="30a13725d5a0929ecab855711341517cfdbcf9f6459a5c37ea3088910ca64874" exitCode=0 Mar 13 20:42:21 crc kubenswrapper[4790]: I0313 20:42:21.782360 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" 
event={"ID":"6940903a-9dc5-4001-bc87-9de2bdce9e52","Type":"ContainerDied","Data":"30a13725d5a0929ecab855711341517cfdbcf9f6459a5c37ea3088910ca64874"} Mar 13 20:42:21 crc kubenswrapper[4790]: I0313 20:42:21.783779 4790 generic.go:334] "Generic (PLEG): container finished" podID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerID="e5bae04cdffb8c83701a42fc9db405a6dca31c8fa75b143d095f27d9c8d79f16" exitCode=0 Mar 13 20:42:21 crc kubenswrapper[4790]: I0313 20:42:21.783826 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fwvj9" event={"ID":"63d87b2a-7e33-4196-a549-c618ac863a8b","Type":"ContainerDied","Data":"e5bae04cdffb8c83701a42fc9db405a6dca31c8fa75b143d095f27d9c8d79f16"} Mar 13 20:42:21 crc kubenswrapper[4790]: I0313 20:42:21.783854 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fwvj9" event={"ID":"63d87b2a-7e33-4196-a549-c618ac863a8b","Type":"ContainerStarted","Data":"17a8fc74803e09cccf53786141283b651260e0e7c4aaf11d9d5e161783ce7bac"} Mar 13 20:42:22 crc kubenswrapper[4790]: I0313 20:42:22.791984 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fwvj9" event={"ID":"63d87b2a-7e33-4196-a549-c618ac863a8b","Type":"ContainerStarted","Data":"59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e"} Mar 13 20:42:22 crc kubenswrapper[4790]: I0313 20:42:22.794207 4790 generic.go:334] "Generic (PLEG): container finished" podID="6940903a-9dc5-4001-bc87-9de2bdce9e52" containerID="8d8bbe3287546ed2b9b806b01bb8d444399ce245956ee3b45cb06c98793275c8" exitCode=0 Mar 13 20:42:22 crc kubenswrapper[4790]: I0313 20:42:22.794272 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" event={"ID":"6940903a-9dc5-4001-bc87-9de2bdce9e52","Type":"ContainerDied","Data":"8d8bbe3287546ed2b9b806b01bb8d444399ce245956ee3b45cb06c98793275c8"} Mar 13 20:42:23 crc 
kubenswrapper[4790]: I0313 20:42:23.813191 4790 generic.go:334] "Generic (PLEG): container finished" podID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerID="59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e" exitCode=0 Mar 13 20:42:23 crc kubenswrapper[4790]: I0313 20:42:23.813317 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fwvj9" event={"ID":"63d87b2a-7e33-4196-a549-c618ac863a8b","Type":"ContainerDied","Data":"59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e"} Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.103260 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.303712 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-util\") pod \"6940903a-9dc5-4001-bc87-9de2bdce9e52\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.304920 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtwst\" (UniqueName: \"kubernetes.io/projected/6940903a-9dc5-4001-bc87-9de2bdce9e52-kube-api-access-mtwst\") pod \"6940903a-9dc5-4001-bc87-9de2bdce9e52\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.305009 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-bundle\") pod \"6940903a-9dc5-4001-bc87-9de2bdce9e52\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.306028 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-bundle" (OuterVolumeSpecName: "bundle") pod "6940903a-9dc5-4001-bc87-9de2bdce9e52" (UID: "6940903a-9dc5-4001-bc87-9de2bdce9e52"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.311055 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6940903a-9dc5-4001-bc87-9de2bdce9e52-kube-api-access-mtwst" (OuterVolumeSpecName: "kube-api-access-mtwst") pod "6940903a-9dc5-4001-bc87-9de2bdce9e52" (UID: "6940903a-9dc5-4001-bc87-9de2bdce9e52"). InnerVolumeSpecName "kube-api-access-mtwst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.321857 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-util" (OuterVolumeSpecName: "util") pod "6940903a-9dc5-4001-bc87-9de2bdce9e52" (UID: "6940903a-9dc5-4001-bc87-9de2bdce9e52"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.406775 4790 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-util\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.406827 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtwst\" (UniqueName: \"kubernetes.io/projected/6940903a-9dc5-4001-bc87-9de2bdce9e52-kube-api-access-mtwst\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.406845 4790 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.821715 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" event={"ID":"6940903a-9dc5-4001-bc87-9de2bdce9e52","Type":"ContainerDied","Data":"b434b30cdb21943ca53f18eaf1729db4fddc700a45eee3938607a5e3f003edd9"} Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.822025 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b434b30cdb21943ca53f18eaf1729db4fddc700a45eee3938607a5e3f003edd9" Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.821807 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.823629 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fwvj9" event={"ID":"63d87b2a-7e33-4196-a549-c618ac863a8b","Type":"ContainerStarted","Data":"e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156"} Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.841608 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fwvj9" podStartSLOduration=2.242014551 podStartE2EDuration="4.841579993s" podCreationTimestamp="2026-03-13 20:42:20 +0000 UTC" firstStartedPulling="2026-03-13 20:42:21.785117607 +0000 UTC m=+872.806233508" lastFinishedPulling="2026-03-13 20:42:24.384683059 +0000 UTC m=+875.405798950" observedRunningTime="2026-03-13 20:42:24.840201245 +0000 UTC m=+875.861317146" watchObservedRunningTime="2026-03-13 20:42:24.841579993 +0000 UTC m=+875.862695884" Mar 13 20:42:28 crc kubenswrapper[4790]: I0313 20:42:28.774130 4790 scope.go:117] "RemoveContainer" containerID="51921e4e629fa9d413e53a9a5c93f032ad474743b6e67b583c5b1e6927de7258" Mar 13 20:42:30 crc kubenswrapper[4790]: I0313 20:42:30.884654 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:30 crc kubenswrapper[4790]: I0313 20:42:30.885146 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:31 crc kubenswrapper[4790]: I0313 20:42:31.937568 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fwvj9" podUID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerName="registry-server" probeResult="failure" output=< Mar 13 20:42:31 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Mar 13 
20:42:31 crc kubenswrapper[4790]: > Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.210564 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54"] Mar 13 20:42:35 crc kubenswrapper[4790]: E0313 20:42:35.210792 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6940903a-9dc5-4001-bc87-9de2bdce9e52" containerName="util" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.210803 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6940903a-9dc5-4001-bc87-9de2bdce9e52" containerName="util" Mar 13 20:42:35 crc kubenswrapper[4790]: E0313 20:42:35.210821 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6940903a-9dc5-4001-bc87-9de2bdce9e52" containerName="pull" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.210827 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6940903a-9dc5-4001-bc87-9de2bdce9e52" containerName="pull" Mar 13 20:42:35 crc kubenswrapper[4790]: E0313 20:42:35.210837 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6940903a-9dc5-4001-bc87-9de2bdce9e52" containerName="extract" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.210843 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6940903a-9dc5-4001-bc87-9de2bdce9e52" containerName="extract" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.210950 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6940903a-9dc5-4001-bc87-9de2bdce9e52" containerName="extract" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.211339 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.213626 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.213686 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.213708 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.213769 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-xh6l6" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.215298 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.227684 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54"] Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.344739 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/da23093d-500f-43f4-805a-b4a252e40940-apiservice-cert\") pod \"metallb-operator-controller-manager-6c885c8d8c-fcv54\" (UID: \"da23093d-500f-43f4-805a-b4a252e40940\") " pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.344817 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhg54\" (UniqueName: \"kubernetes.io/projected/da23093d-500f-43f4-805a-b4a252e40940-kube-api-access-lhg54\") pod 
\"metallb-operator-controller-manager-6c885c8d8c-fcv54\" (UID: \"da23093d-500f-43f4-805a-b4a252e40940\") " pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.344848 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/da23093d-500f-43f4-805a-b4a252e40940-webhook-cert\") pod \"metallb-operator-controller-manager-6c885c8d8c-fcv54\" (UID: \"da23093d-500f-43f4-805a-b4a252e40940\") " pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.446090 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/da23093d-500f-43f4-805a-b4a252e40940-apiservice-cert\") pod \"metallb-operator-controller-manager-6c885c8d8c-fcv54\" (UID: \"da23093d-500f-43f4-805a-b4a252e40940\") " pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.446146 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhg54\" (UniqueName: \"kubernetes.io/projected/da23093d-500f-43f4-805a-b4a252e40940-kube-api-access-lhg54\") pod \"metallb-operator-controller-manager-6c885c8d8c-fcv54\" (UID: \"da23093d-500f-43f4-805a-b4a252e40940\") " pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.446173 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/da23093d-500f-43f4-805a-b4a252e40940-webhook-cert\") pod \"metallb-operator-controller-manager-6c885c8d8c-fcv54\" (UID: \"da23093d-500f-43f4-805a-b4a252e40940\") " pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:35 crc 
kubenswrapper[4790]: I0313 20:42:35.451887 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/da23093d-500f-43f4-805a-b4a252e40940-webhook-cert\") pod \"metallb-operator-controller-manager-6c885c8d8c-fcv54\" (UID: \"da23093d-500f-43f4-805a-b4a252e40940\") " pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.464179 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhg54\" (UniqueName: \"kubernetes.io/projected/da23093d-500f-43f4-805a-b4a252e40940-kube-api-access-lhg54\") pod \"metallb-operator-controller-manager-6c885c8d8c-fcv54\" (UID: \"da23093d-500f-43f4-805a-b4a252e40940\") " pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.468453 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/da23093d-500f-43f4-805a-b4a252e40940-apiservice-cert\") pod \"metallb-operator-controller-manager-6c885c8d8c-fcv54\" (UID: \"da23093d-500f-43f4-805a-b4a252e40940\") " pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.526421 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.536365 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2"] Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.537174 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.544755 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-6jfg4" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.545001 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.547475 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.565196 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2"] Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.648022 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/783be831-b522-42a0-9cbe-f234ed3a027c-apiservice-cert\") pod \"metallb-operator-webhook-server-76c9b767d4-c6mq2\" (UID: \"783be831-b522-42a0-9cbe-f234ed3a027c\") " pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.648071 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw7kq\" (UniqueName: \"kubernetes.io/projected/783be831-b522-42a0-9cbe-f234ed3a027c-kube-api-access-tw7kq\") pod \"metallb-operator-webhook-server-76c9b767d4-c6mq2\" (UID: \"783be831-b522-42a0-9cbe-f234ed3a027c\") " pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.648157 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/783be831-b522-42a0-9cbe-f234ed3a027c-webhook-cert\") pod \"metallb-operator-webhook-server-76c9b767d4-c6mq2\" (UID: \"783be831-b522-42a0-9cbe-f234ed3a027c\") " pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.749428 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/783be831-b522-42a0-9cbe-f234ed3a027c-apiservice-cert\") pod \"metallb-operator-webhook-server-76c9b767d4-c6mq2\" (UID: \"783be831-b522-42a0-9cbe-f234ed3a027c\") " pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.749475 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw7kq\" (UniqueName: \"kubernetes.io/projected/783be831-b522-42a0-9cbe-f234ed3a027c-kube-api-access-tw7kq\") pod \"metallb-operator-webhook-server-76c9b767d4-c6mq2\" (UID: \"783be831-b522-42a0-9cbe-f234ed3a027c\") " pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.749538 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/783be831-b522-42a0-9cbe-f234ed3a027c-webhook-cert\") pod \"metallb-operator-webhook-server-76c9b767d4-c6mq2\" (UID: \"783be831-b522-42a0-9cbe-f234ed3a027c\") " pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.754234 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/783be831-b522-42a0-9cbe-f234ed3a027c-webhook-cert\") pod \"metallb-operator-webhook-server-76c9b767d4-c6mq2\" (UID: \"783be831-b522-42a0-9cbe-f234ed3a027c\") " pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:35 crc 
kubenswrapper[4790]: I0313 20:42:35.754623 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/783be831-b522-42a0-9cbe-f234ed3a027c-apiservice-cert\") pod \"metallb-operator-webhook-server-76c9b767d4-c6mq2\" (UID: \"783be831-b522-42a0-9cbe-f234ed3a027c\") " pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.768851 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw7kq\" (UniqueName: \"kubernetes.io/projected/783be831-b522-42a0-9cbe-f234ed3a027c-kube-api-access-tw7kq\") pod \"metallb-operator-webhook-server-76c9b767d4-c6mq2\" (UID: \"783be831-b522-42a0-9cbe-f234ed3a027c\") " pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.774184 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54"] Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.890178 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" event={"ID":"da23093d-500f-43f4-805a-b4a252e40940","Type":"ContainerStarted","Data":"3f5fc4c636aafa39e2b75bfe1d26cc0ce009e8e3fa9f2c626da8ffb85d7cfb70"} Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.905137 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:36 crc kubenswrapper[4790]: I0313 20:42:36.329149 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2"] Mar 13 20:42:36 crc kubenswrapper[4790]: W0313 20:42:36.332865 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod783be831_b522_42a0_9cbe_f234ed3a027c.slice/crio-cb3e7b3fbb5f35b7f0dea46a4260b3e53f91db6f09af72ca3c4e0995390a6350 WatchSource:0}: Error finding container cb3e7b3fbb5f35b7f0dea46a4260b3e53f91db6f09af72ca3c4e0995390a6350: Status 404 returned error can't find the container with id cb3e7b3fbb5f35b7f0dea46a4260b3e53f91db6f09af72ca3c4e0995390a6350 Mar 13 20:42:36 crc kubenswrapper[4790]: I0313 20:42:36.896303 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" event={"ID":"783be831-b522-42a0-9cbe-f234ed3a027c","Type":"ContainerStarted","Data":"cb3e7b3fbb5f35b7f0dea46a4260b3e53f91db6f09af72ca3c4e0995390a6350"} Mar 13 20:42:39 crc kubenswrapper[4790]: I0313 20:42:39.917142 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" event={"ID":"da23093d-500f-43f4-805a-b4a252e40940","Type":"ContainerStarted","Data":"1ae63466639d55a8b537202415dad25349ac714c24132420120fa23ce9544150"} Mar 13 20:42:39 crc kubenswrapper[4790]: I0313 20:42:39.917874 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:39 crc kubenswrapper[4790]: I0313 20:42:39.943501 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" podStartSLOduration=1.951263334 podStartE2EDuration="4.943422428s" 
podCreationTimestamp="2026-03-13 20:42:35 +0000 UTC" firstStartedPulling="2026-03-13 20:42:35.785154721 +0000 UTC m=+886.806270622" lastFinishedPulling="2026-03-13 20:42:38.777313825 +0000 UTC m=+889.798429716" observedRunningTime="2026-03-13 20:42:39.937027165 +0000 UTC m=+890.958143066" watchObservedRunningTime="2026-03-13 20:42:39.943422428 +0000 UTC m=+890.964538319" Mar 13 20:42:40 crc kubenswrapper[4790]: I0313 20:42:40.923597 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" event={"ID":"783be831-b522-42a0-9cbe-f234ed3a027c","Type":"ContainerStarted","Data":"7bc869e32accf2119f4809ada290969661cc46360e4f1d973aa3e7018afa894f"} Mar 13 20:42:40 crc kubenswrapper[4790]: I0313 20:42:40.939279 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:40 crc kubenswrapper[4790]: I0313 20:42:40.941483 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" podStartSLOduration=1.957889454 podStartE2EDuration="5.941470728s" podCreationTimestamp="2026-03-13 20:42:35 +0000 UTC" firstStartedPulling="2026-03-13 20:42:36.335809925 +0000 UTC m=+887.356925816" lastFinishedPulling="2026-03-13 20:42:40.319391199 +0000 UTC m=+891.340507090" observedRunningTime="2026-03-13 20:42:40.940154122 +0000 UTC m=+891.961270013" watchObservedRunningTime="2026-03-13 20:42:40.941470728 +0000 UTC m=+891.962586619" Mar 13 20:42:40 crc kubenswrapper[4790]: I0313 20:42:40.988604 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:41 crc kubenswrapper[4790]: I0313 20:42:41.930711 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:42 crc kubenswrapper[4790]: I0313 
20:42:42.686958 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fwvj9"] Mar 13 20:42:42 crc kubenswrapper[4790]: I0313 20:42:42.935851 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fwvj9" podUID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerName="registry-server" containerID="cri-o://e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156" gracePeriod=2 Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.404926 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.575899 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-catalog-content\") pod \"63d87b2a-7e33-4196-a549-c618ac863a8b\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.576053 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-utilities\") pod \"63d87b2a-7e33-4196-a549-c618ac863a8b\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.576123 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trdh2\" (UniqueName: \"kubernetes.io/projected/63d87b2a-7e33-4196-a549-c618ac863a8b-kube-api-access-trdh2\") pod \"63d87b2a-7e33-4196-a549-c618ac863a8b\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.576951 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-utilities" (OuterVolumeSpecName: 
"utilities") pod "63d87b2a-7e33-4196-a549-c618ac863a8b" (UID: "63d87b2a-7e33-4196-a549-c618ac863a8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.581434 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63d87b2a-7e33-4196-a549-c618ac863a8b-kube-api-access-trdh2" (OuterVolumeSpecName: "kube-api-access-trdh2") pod "63d87b2a-7e33-4196-a549-c618ac863a8b" (UID: "63d87b2a-7e33-4196-a549-c618ac863a8b"). InnerVolumeSpecName "kube-api-access-trdh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.677456 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trdh2\" (UniqueName: \"kubernetes.io/projected/63d87b2a-7e33-4196-a549-c618ac863a8b-kube-api-access-trdh2\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.677520 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.722081 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63d87b2a-7e33-4196-a549-c618ac863a8b" (UID: "63d87b2a-7e33-4196-a549-c618ac863a8b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.778690 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.945161 4790 generic.go:334] "Generic (PLEG): container finished" podID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerID="e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156" exitCode=0 Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.945204 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fwvj9" event={"ID":"63d87b2a-7e33-4196-a549-c618ac863a8b","Type":"ContainerDied","Data":"e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156"} Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.945232 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fwvj9" event={"ID":"63d87b2a-7e33-4196-a549-c618ac863a8b","Type":"ContainerDied","Data":"17a8fc74803e09cccf53786141283b651260e0e7c4aaf11d9d5e161783ce7bac"} Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.945249 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.945252 4790 scope.go:117] "RemoveContainer" containerID="e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156" Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.972761 4790 scope.go:117] "RemoveContainer" containerID="59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e" Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.973965 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fwvj9"] Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.978704 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fwvj9"] Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.998718 4790 scope.go:117] "RemoveContainer" containerID="e5bae04cdffb8c83701a42fc9db405a6dca31c8fa75b143d095f27d9c8d79f16" Mar 13 20:42:44 crc kubenswrapper[4790]: I0313 20:42:44.017628 4790 scope.go:117] "RemoveContainer" containerID="e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156" Mar 13 20:42:44 crc kubenswrapper[4790]: E0313 20:42:44.018035 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156\": container with ID starting with e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156 not found: ID does not exist" containerID="e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156" Mar 13 20:42:44 crc kubenswrapper[4790]: I0313 20:42:44.018068 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156"} err="failed to get container status \"e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156\": rpc error: code = NotFound desc = could not find container 
\"e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156\": container with ID starting with e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156 not found: ID does not exist" Mar 13 20:42:44 crc kubenswrapper[4790]: I0313 20:42:44.018103 4790 scope.go:117] "RemoveContainer" containerID="59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e" Mar 13 20:42:44 crc kubenswrapper[4790]: E0313 20:42:44.018407 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e\": container with ID starting with 59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e not found: ID does not exist" containerID="59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e" Mar 13 20:42:44 crc kubenswrapper[4790]: I0313 20:42:44.018433 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e"} err="failed to get container status \"59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e\": rpc error: code = NotFound desc = could not find container \"59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e\": container with ID starting with 59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e not found: ID does not exist" Mar 13 20:42:44 crc kubenswrapper[4790]: I0313 20:42:44.018445 4790 scope.go:117] "RemoveContainer" containerID="e5bae04cdffb8c83701a42fc9db405a6dca31c8fa75b143d095f27d9c8d79f16" Mar 13 20:42:44 crc kubenswrapper[4790]: E0313 20:42:44.018642 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5bae04cdffb8c83701a42fc9db405a6dca31c8fa75b143d095f27d9c8d79f16\": container with ID starting with e5bae04cdffb8c83701a42fc9db405a6dca31c8fa75b143d095f27d9c8d79f16 not found: ID does not exist" 
containerID="e5bae04cdffb8c83701a42fc9db405a6dca31c8fa75b143d095f27d9c8d79f16" Mar 13 20:42:44 crc kubenswrapper[4790]: I0313 20:42:44.018663 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5bae04cdffb8c83701a42fc9db405a6dca31c8fa75b143d095f27d9c8d79f16"} err="failed to get container status \"e5bae04cdffb8c83701a42fc9db405a6dca31c8fa75b143d095f27d9c8d79f16\": rpc error: code = NotFound desc = could not find container \"e5bae04cdffb8c83701a42fc9db405a6dca31c8fa75b143d095f27d9c8d79f16\": container with ID starting with e5bae04cdffb8c83701a42fc9db405a6dca31c8fa75b143d095f27d9c8d79f16 not found: ID does not exist" Mar 13 20:42:45 crc kubenswrapper[4790]: I0313 20:42:45.666472 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63d87b2a-7e33-4196-a549-c618ac863a8b" path="/var/lib/kubelet/pods/63d87b2a-7e33-4196-a549-c618ac863a8b/volumes" Mar 13 20:42:55 crc kubenswrapper[4790]: I0313 20:42:55.909781 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:43:15 crc kubenswrapper[4790]: I0313 20:43:15.529343 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.228747 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-r97zs"] Mar 13 20:43:16 crc kubenswrapper[4790]: E0313 20:43:16.229054 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerName="extract-content" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.229070 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerName="extract-content" Mar 13 20:43:16 crc kubenswrapper[4790]: E0313 20:43:16.229082 4790 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerName="extract-utilities" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.229089 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerName="extract-utilities" Mar 13 20:43:16 crc kubenswrapper[4790]: E0313 20:43:16.229104 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerName="registry-server" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.229112 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerName="registry-server" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.229234 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerName="registry-server" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.231361 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.232908 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8"] Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.234212 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.235036 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.235096 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-skcdc" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.235221 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.237245 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.244751 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8"] Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.321868 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-5tk2m"] Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.322914 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.325069 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-6v8lw" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.325149 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.325184 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.325224 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.342647 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-czl9k"] Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.343559 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.351656 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.358814 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-czl9k"] Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.413946 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-frr-sockets\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.414217 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pshkb\" (UniqueName: \"kubernetes.io/projected/3ab7e856-a311-4e29-aabf-adaa27363613-kube-api-access-pshkb\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.414263 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-reloader\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.414284 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ab7e856-a311-4e29-aabf-adaa27363613-metrics-certs\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 
20:43:16.414323 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-metrics\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.414339 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/472cc73a-53fe-4d7c-aec8-b2154023ba90-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-8ckr8\" (UID: \"472cc73a-53fe-4d7c-aec8-b2154023ba90\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.414355 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3ab7e856-a311-4e29-aabf-adaa27363613-frr-startup\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.414419 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-frr-conf\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.414434 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72d28\" (UniqueName: \"kubernetes.io/projected/472cc73a-53fe-4d7c-aec8-b2154023ba90-kube-api-access-72d28\") pod \"frr-k8s-webhook-server-bcc4b6f68-8ckr8\" (UID: \"472cc73a-53fe-4d7c-aec8-b2154023ba90\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515417 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72d28\" (UniqueName: \"kubernetes.io/projected/472cc73a-53fe-4d7c-aec8-b2154023ba90-kube-api-access-72d28\") pod \"frr-k8s-webhook-server-bcc4b6f68-8ckr8\" (UID: \"472cc73a-53fe-4d7c-aec8-b2154023ba90\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515462 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-frr-conf\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515497 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-frr-sockets\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515519 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pshkb\" (UniqueName: \"kubernetes.io/projected/3ab7e856-a311-4e29-aabf-adaa27363613-kube-api-access-pshkb\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515542 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a3729738-ead5-47e0-95de-04dc39fb0516-metallb-excludel2\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515577 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-h7vz4\" (UniqueName: \"kubernetes.io/projected/a3729738-ead5-47e0-95de-04dc39fb0516-kube-api-access-h7vz4\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515598 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6zgg\" (UniqueName: \"kubernetes.io/projected/d5ef8654-e56f-454b-9fae-0753a30dab0f-kube-api-access-x6zgg\") pod \"controller-7bb4cc7c98-czl9k\" (UID: \"d5ef8654-e56f-454b-9fae-0753a30dab0f\") " pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515617 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5ef8654-e56f-454b-9fae-0753a30dab0f-metrics-certs\") pod \"controller-7bb4cc7c98-czl9k\" (UID: \"d5ef8654-e56f-454b-9fae-0753a30dab0f\") " pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515636 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-reloader\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515650 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ab7e856-a311-4e29-aabf-adaa27363613-metrics-certs\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515668 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-memberlist\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515691 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-metrics-certs\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515708 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5ef8654-e56f-454b-9fae-0753a30dab0f-cert\") pod \"controller-7bb4cc7c98-czl9k\" (UID: \"d5ef8654-e56f-454b-9fae-0753a30dab0f\") " pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515734 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/472cc73a-53fe-4d7c-aec8-b2154023ba90-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-8ckr8\" (UID: \"472cc73a-53fe-4d7c-aec8-b2154023ba90\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515747 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-metrics\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.516102 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-frr-sockets\") pod \"frr-k8s-r97zs\" (UID: 
\"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.516114 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-frr-conf\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.516229 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-reloader\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.516826 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3ab7e856-a311-4e29-aabf-adaa27363613-frr-startup\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.517022 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-metrics\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.517559 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3ab7e856-a311-4e29-aabf-adaa27363613-frr-startup\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.525952 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/3ab7e856-a311-4e29-aabf-adaa27363613-metrics-certs\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.526460 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/472cc73a-53fe-4d7c-aec8-b2154023ba90-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-8ckr8\" (UID: \"472cc73a-53fe-4d7c-aec8-b2154023ba90\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.534563 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pshkb\" (UniqueName: \"kubernetes.io/projected/3ab7e856-a311-4e29-aabf-adaa27363613-kube-api-access-pshkb\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.538307 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72d28\" (UniqueName: \"kubernetes.io/projected/472cc73a-53fe-4d7c-aec8-b2154023ba90-kube-api-access-72d28\") pod \"frr-k8s-webhook-server-bcc4b6f68-8ckr8\" (UID: \"472cc73a-53fe-4d7c-aec8-b2154023ba90\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.564395 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.573695 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.617982 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a3729738-ead5-47e0-95de-04dc39fb0516-metallb-excludel2\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.618025 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7vz4\" (UniqueName: \"kubernetes.io/projected/a3729738-ead5-47e0-95de-04dc39fb0516-kube-api-access-h7vz4\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.618046 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6zgg\" (UniqueName: \"kubernetes.io/projected/d5ef8654-e56f-454b-9fae-0753a30dab0f-kube-api-access-x6zgg\") pod \"controller-7bb4cc7c98-czl9k\" (UID: \"d5ef8654-e56f-454b-9fae-0753a30dab0f\") " pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.618065 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5ef8654-e56f-454b-9fae-0753a30dab0f-metrics-certs\") pod \"controller-7bb4cc7c98-czl9k\" (UID: \"d5ef8654-e56f-454b-9fae-0753a30dab0f\") " pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.618087 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-memberlist\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc 
kubenswrapper[4790]: I0313 20:43:16.618129 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-metrics-certs\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.618148 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5ef8654-e56f-454b-9fae-0753a30dab0f-cert\") pod \"controller-7bb4cc7c98-czl9k\" (UID: \"d5ef8654-e56f-454b-9fae-0753a30dab0f\") " pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.619022 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a3729738-ead5-47e0-95de-04dc39fb0516-metallb-excludel2\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: E0313 20:43:16.620056 4790 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 13 20:43:16 crc kubenswrapper[4790]: E0313 20:43:16.620190 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-memberlist podName:a3729738-ead5-47e0-95de-04dc39fb0516 nodeName:}" failed. No retries permitted until 2026-03-13 20:43:17.12016708 +0000 UTC m=+928.141282971 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-memberlist") pod "speaker-5tk2m" (UID: "a3729738-ead5-47e0-95de-04dc39fb0516") : secret "metallb-memberlist" not found Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.623441 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5ef8654-e56f-454b-9fae-0753a30dab0f-metrics-certs\") pod \"controller-7bb4cc7c98-czl9k\" (UID: \"d5ef8654-e56f-454b-9fae-0753a30dab0f\") " pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.623947 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-metrics-certs\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.627498 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.632711 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5ef8654-e56f-454b-9fae-0753a30dab0f-cert\") pod \"controller-7bb4cc7c98-czl9k\" (UID: \"d5ef8654-e56f-454b-9fae-0753a30dab0f\") " pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.652505 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7vz4\" (UniqueName: \"kubernetes.io/projected/a3729738-ead5-47e0-95de-04dc39fb0516-kube-api-access-h7vz4\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.655097 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-x6zgg\" (UniqueName: \"kubernetes.io/projected/d5ef8654-e56f-454b-9fae-0753a30dab0f-kube-api-access-x6zgg\") pod \"controller-7bb4cc7c98-czl9k\" (UID: \"d5ef8654-e56f-454b-9fae-0753a30dab0f\") " pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.660775 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.746338 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 20:43:17 crc kubenswrapper[4790]: I0313 20:43:17.125563 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-memberlist\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:17 crc kubenswrapper[4790]: E0313 20:43:17.125978 4790 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 13 20:43:17 crc kubenswrapper[4790]: E0313 20:43:17.126040 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-memberlist podName:a3729738-ead5-47e0-95de-04dc39fb0516 nodeName:}" failed. No retries permitted until 2026-03-13 20:43:18.126022573 +0000 UTC m=+929.147138464 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-memberlist") pod "speaker-5tk2m" (UID: "a3729738-ead5-47e0-95de-04dc39fb0516") : secret "metallb-memberlist" not found Mar 13 20:43:17 crc kubenswrapper[4790]: I0313 20:43:17.126822 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-czl9k"] Mar 13 20:43:17 crc kubenswrapper[4790]: I0313 20:43:17.129819 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8"] Mar 13 20:43:17 crc kubenswrapper[4790]: I0313 20:43:17.276128 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-czl9k" event={"ID":"d5ef8654-e56f-454b-9fae-0753a30dab0f","Type":"ContainerStarted","Data":"68fd06d4f6af3b0016e03fa0aadc5c6a3704ee615972d40e0574fd87d3734a67"} Mar 13 20:43:17 crc kubenswrapper[4790]: I0313 20:43:17.276169 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-czl9k" event={"ID":"d5ef8654-e56f-454b-9fae-0753a30dab0f","Type":"ContainerStarted","Data":"ea7004bcf36dbb656400b38b97f3df09925aa43298c8c545be2b91bed7e4efd7"} Mar 13 20:43:17 crc kubenswrapper[4790]: I0313 20:43:17.278058 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" event={"ID":"472cc73a-53fe-4d7c-aec8-b2154023ba90","Type":"ContainerStarted","Data":"e244eb751be0e2222b0a9e2d0b566d189bd0df4161e470b0c39884b6e54be354"} Mar 13 20:43:17 crc kubenswrapper[4790]: I0313 20:43:17.279930 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r97zs" event={"ID":"3ab7e856-a311-4e29-aabf-adaa27363613","Type":"ContainerStarted","Data":"a5c13d5df3d78f0f5c75b78a768ccfd97cea573967c0ed084bb8b8c745280933"} Mar 13 20:43:18 crc kubenswrapper[4790]: I0313 20:43:18.140110 4790 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-memberlist\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:18 crc kubenswrapper[4790]: I0313 20:43:18.146142 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-memberlist\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:18 crc kubenswrapper[4790]: I0313 20:43:18.289430 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-czl9k" event={"ID":"d5ef8654-e56f-454b-9fae-0753a30dab0f","Type":"ContainerStarted","Data":"d70dc94a0fc88513769f3de09d52205171892a67339c2b41e4f7a90c537ef9d4"} Mar 13 20:43:18 crc kubenswrapper[4790]: I0313 20:43:18.290640 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:18 crc kubenswrapper[4790]: I0313 20:43:18.310960 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-czl9k" podStartSLOduration=2.310918768 podStartE2EDuration="2.310918768s" podCreationTimestamp="2026-03-13 20:43:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:43:18.307098594 +0000 UTC m=+929.328214485" watchObservedRunningTime="2026-03-13 20:43:18.310918768 +0000 UTC m=+929.332034659" Mar 13 20:43:18 crc kubenswrapper[4790]: I0313 20:43:18.445348 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-5tk2m" Mar 13 20:43:18 crc kubenswrapper[4790]: W0313 20:43:18.479257 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3729738_ead5_47e0_95de_04dc39fb0516.slice/crio-b5c4e8d214625e52c8d2b57e21b6ea4720f48e981fcf8670bb2d4261739e6e9a WatchSource:0}: Error finding container b5c4e8d214625e52c8d2b57e21b6ea4720f48e981fcf8670bb2d4261739e6e9a: Status 404 returned error can't find the container with id b5c4e8d214625e52c8d2b57e21b6ea4720f48e981fcf8670bb2d4261739e6e9a Mar 13 20:43:19 crc kubenswrapper[4790]: I0313 20:43:19.300865 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5tk2m" event={"ID":"a3729738-ead5-47e0-95de-04dc39fb0516","Type":"ContainerStarted","Data":"3ae5de1a33157acac843c2f1b7002af2f2799488b9a72e5c21b2c1d9d878eaa1"} Mar 13 20:43:19 crc kubenswrapper[4790]: I0313 20:43:19.301423 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5tk2m" event={"ID":"a3729738-ead5-47e0-95de-04dc39fb0516","Type":"ContainerStarted","Data":"f8d2f65b8e4e5b233e46774703123181a7404d39c2265bfe084a46e8ed71b1f9"} Mar 13 20:43:19 crc kubenswrapper[4790]: I0313 20:43:19.301442 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5tk2m" event={"ID":"a3729738-ead5-47e0-95de-04dc39fb0516","Type":"ContainerStarted","Data":"b5c4e8d214625e52c8d2b57e21b6ea4720f48e981fcf8670bb2d4261739e6e9a"} Mar 13 20:43:19 crc kubenswrapper[4790]: I0313 20:43:19.302641 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-5tk2m" Mar 13 20:43:19 crc kubenswrapper[4790]: I0313 20:43:19.326015 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-5tk2m" podStartSLOduration=3.325996646 podStartE2EDuration="3.325996646s" podCreationTimestamp="2026-03-13 20:43:16 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:43:19.316464447 +0000 UTC m=+930.337580348" watchObservedRunningTime="2026-03-13 20:43:19.325996646 +0000 UTC m=+930.347112537" Mar 13 20:43:24 crc kubenswrapper[4790]: I0313 20:43:24.344763 4790 generic.go:334] "Generic (PLEG): container finished" podID="3ab7e856-a311-4e29-aabf-adaa27363613" containerID="64b6e9a811b920351f37a00f8395a4bdfed37c50e7e92c2ab9d0b43fbfb9a502" exitCode=0 Mar 13 20:43:24 crc kubenswrapper[4790]: I0313 20:43:24.344826 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r97zs" event={"ID":"3ab7e856-a311-4e29-aabf-adaa27363613","Type":"ContainerDied","Data":"64b6e9a811b920351f37a00f8395a4bdfed37c50e7e92c2ab9d0b43fbfb9a502"} Mar 13 20:43:24 crc kubenswrapper[4790]: I0313 20:43:24.347234 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" event={"ID":"472cc73a-53fe-4d7c-aec8-b2154023ba90","Type":"ContainerStarted","Data":"46636b011c58b0a72156b97685eb290373197aaf928bdc7010bb84803ecaba6b"} Mar 13 20:43:24 crc kubenswrapper[4790]: I0313 20:43:24.347513 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" Mar 13 20:43:24 crc kubenswrapper[4790]: I0313 20:43:24.384854 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" podStartSLOduration=1.668061799 podStartE2EDuration="8.384836357s" podCreationTimestamp="2026-03-13 20:43:16 +0000 UTC" firstStartedPulling="2026-03-13 20:43:17.139767836 +0000 UTC m=+928.160883727" lastFinishedPulling="2026-03-13 20:43:23.856542394 +0000 UTC m=+934.877658285" observedRunningTime="2026-03-13 20:43:24.38165425 +0000 UTC m=+935.402770141" watchObservedRunningTime="2026-03-13 20:43:24.384836357 +0000 UTC m=+935.405952258" Mar 13 20:43:25 
crc kubenswrapper[4790]: I0313 20:43:25.354553 4790 generic.go:334] "Generic (PLEG): container finished" podID="3ab7e856-a311-4e29-aabf-adaa27363613" containerID="0b052a2baddbeece7ca41bef76737dde35ce3c507cc3b2219e2854674ed991bd" exitCode=0 Mar 13 20:43:25 crc kubenswrapper[4790]: I0313 20:43:25.354614 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r97zs" event={"ID":"3ab7e856-a311-4e29-aabf-adaa27363613","Type":"ContainerDied","Data":"0b052a2baddbeece7ca41bef76737dde35ce3c507cc3b2219e2854674ed991bd"} Mar 13 20:43:26 crc kubenswrapper[4790]: I0313 20:43:26.361900 4790 generic.go:334] "Generic (PLEG): container finished" podID="3ab7e856-a311-4e29-aabf-adaa27363613" containerID="7bf46f97328a6ee37e75f896df31fc2301c7e73214e16ae3f85cff47a0ad2a75" exitCode=0 Mar 13 20:43:26 crc kubenswrapper[4790]: I0313 20:43:26.361983 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r97zs" event={"ID":"3ab7e856-a311-4e29-aabf-adaa27363613","Type":"ContainerDied","Data":"7bf46f97328a6ee37e75f896df31fc2301c7e73214e16ae3f85cff47a0ad2a75"} Mar 13 20:43:27 crc kubenswrapper[4790]: I0313 20:43:27.376968 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r97zs" event={"ID":"3ab7e856-a311-4e29-aabf-adaa27363613","Type":"ContainerStarted","Data":"f1d31aaa0cb6d63b64969826ff66ad1625802ad0d839543120b8c6c1420816bf"} Mar 13 20:43:27 crc kubenswrapper[4790]: I0313 20:43:27.377368 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r97zs" event={"ID":"3ab7e856-a311-4e29-aabf-adaa27363613","Type":"ContainerStarted","Data":"c462c7755d333c97ba3a8fa96f732cd286e209277633fd3770b203861bb6f567"} Mar 13 20:43:27 crc kubenswrapper[4790]: I0313 20:43:27.377404 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r97zs" 
event={"ID":"3ab7e856-a311-4e29-aabf-adaa27363613","Type":"ContainerStarted","Data":"cdf225191adab2804534ddc3e506e7659f204d3375754b0fbbff8b6a55587198"} Mar 13 20:43:27 crc kubenswrapper[4790]: I0313 20:43:27.377420 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r97zs" event={"ID":"3ab7e856-a311-4e29-aabf-adaa27363613","Type":"ContainerStarted","Data":"3473d20eda94cf5cc80a4740762659ea126ca9510d1e872eeaea1be5650500d1"} Mar 13 20:43:27 crc kubenswrapper[4790]: I0313 20:43:27.377443 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:27 crc kubenswrapper[4790]: I0313 20:43:27.377486 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r97zs" event={"ID":"3ab7e856-a311-4e29-aabf-adaa27363613","Type":"ContainerStarted","Data":"4921b57f72534d48dd03ace0e29e46079aa124fe14ae3c410c35da9961a8dcaa"} Mar 13 20:43:27 crc kubenswrapper[4790]: I0313 20:43:27.377500 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r97zs" event={"ID":"3ab7e856-a311-4e29-aabf-adaa27363613","Type":"ContainerStarted","Data":"0aba41e40c99fb6a7c39a3128a7c6b9ed7247a9e12c598623f8bfd63af710add"} Mar 13 20:43:27 crc kubenswrapper[4790]: I0313 20:43:27.398077 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-r97zs" podStartSLOduration=4.318485801 podStartE2EDuration="11.398049991s" podCreationTimestamp="2026-03-13 20:43:16 +0000 UTC" firstStartedPulling="2026-03-13 20:43:16.746068376 +0000 UTC m=+927.767184267" lastFinishedPulling="2026-03-13 20:43:23.825632526 +0000 UTC m=+934.846748457" observedRunningTime="2026-03-13 20:43:27.395776779 +0000 UTC m=+938.416892680" watchObservedRunningTime="2026-03-13 20:43:27.398049991 +0000 UTC m=+938.419165872" Mar 13 20:43:28 crc kubenswrapper[4790]: I0313 20:43:28.448879 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/speaker-5tk2m" Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.070716 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-bqc7r"] Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.072122 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bqc7r" Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.075160 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-f8kc8" Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.076171 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.079548 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.081885 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-bqc7r"] Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.212995 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxq7s\" (UniqueName: \"kubernetes.io/projected/92557fd1-85f4-48e5-9923-1d833bffe6d5-kube-api-access-mxq7s\") pod \"openstack-operator-index-bqc7r\" (UID: \"92557fd1-85f4-48e5-9923-1d833bffe6d5\") " pod="openstack-operators/openstack-operator-index-bqc7r" Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.313973 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxq7s\" (UniqueName: \"kubernetes.io/projected/92557fd1-85f4-48e5-9923-1d833bffe6d5-kube-api-access-mxq7s\") pod \"openstack-operator-index-bqc7r\" (UID: \"92557fd1-85f4-48e5-9923-1d833bffe6d5\") " pod="openstack-operators/openstack-operator-index-bqc7r" Mar 13 20:43:31 
crc kubenswrapper[4790]: I0313 20:43:31.332303 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxq7s\" (UniqueName: \"kubernetes.io/projected/92557fd1-85f4-48e5-9923-1d833bffe6d5-kube-api-access-mxq7s\") pod \"openstack-operator-index-bqc7r\" (UID: \"92557fd1-85f4-48e5-9923-1d833bffe6d5\") " pod="openstack-operators/openstack-operator-index-bqc7r" Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.399859 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bqc7r" Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.564862 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.603448 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.774098 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-bqc7r"] Mar 13 20:43:31 crc kubenswrapper[4790]: W0313 20:43:31.781787 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92557fd1_85f4_48e5_9923_1d833bffe6d5.slice/crio-d1b154b866f226aa1234710b3ebe57a7012f2042be36547d641cdb380f9a21d4 WatchSource:0}: Error finding container d1b154b866f226aa1234710b3ebe57a7012f2042be36547d641cdb380f9a21d4: Status 404 returned error can't find the container with id d1b154b866f226aa1234710b3ebe57a7012f2042be36547d641cdb380f9a21d4 Mar 13 20:43:32 crc kubenswrapper[4790]: I0313 20:43:32.404620 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bqc7r" event={"ID":"92557fd1-85f4-48e5-9923-1d833bffe6d5","Type":"ContainerStarted","Data":"d1b154b866f226aa1234710b3ebe57a7012f2042be36547d641cdb380f9a21d4"} Mar 13 20:43:34 crc 
kubenswrapper[4790]: I0313 20:43:34.452721 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-bqc7r"] Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.051790 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-58vcj"] Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.052833 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-58vcj" Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.071324 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-58vcj"] Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.169644 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l64wv\" (UniqueName: \"kubernetes.io/projected/db35ffd8-ac53-48ad-8035-53066c9df48b-kube-api-access-l64wv\") pod \"openstack-operator-index-58vcj\" (UID: \"db35ffd8-ac53-48ad-8035-53066c9df48b\") " pod="openstack-operators/openstack-operator-index-58vcj" Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.270727 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l64wv\" (UniqueName: \"kubernetes.io/projected/db35ffd8-ac53-48ad-8035-53066c9df48b-kube-api-access-l64wv\") pod \"openstack-operator-index-58vcj\" (UID: \"db35ffd8-ac53-48ad-8035-53066c9df48b\") " pod="openstack-operators/openstack-operator-index-58vcj" Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.289063 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l64wv\" (UniqueName: \"kubernetes.io/projected/db35ffd8-ac53-48ad-8035-53066c9df48b-kube-api-access-l64wv\") pod \"openstack-operator-index-58vcj\" (UID: \"db35ffd8-ac53-48ad-8035-53066c9df48b\") " pod="openstack-operators/openstack-operator-index-58vcj" Mar 13 20:43:35 crc 
kubenswrapper[4790]: I0313 20:43:35.377177 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-58vcj" Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.436249 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bqc7r" event={"ID":"92557fd1-85f4-48e5-9923-1d833bffe6d5","Type":"ContainerStarted","Data":"42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4"} Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.436663 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-bqc7r" podUID="92557fd1-85f4-48e5-9923-1d833bffe6d5" containerName="registry-server" containerID="cri-o://42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4" gracePeriod=2 Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.462868 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-bqc7r" podStartSLOduration=1.482063315 podStartE2EDuration="4.46284764s" podCreationTimestamp="2026-03-13 20:43:31 +0000 UTC" firstStartedPulling="2026-03-13 20:43:31.783629668 +0000 UTC m=+942.804745559" lastFinishedPulling="2026-03-13 20:43:34.764413993 +0000 UTC m=+945.785529884" observedRunningTime="2026-03-13 20:43:35.455015457 +0000 UTC m=+946.476131348" watchObservedRunningTime="2026-03-13 20:43:35.46284764 +0000 UTC m=+946.483963531" Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.604861 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-58vcj"] Mar 13 20:43:35 crc kubenswrapper[4790]: W0313 20:43:35.609572 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb35ffd8_ac53_48ad_8035_53066c9df48b.slice/crio-e44106bc97a902c9a27a38a3f62d031bbd361bb2582f3c0b884655f0f82026b5 
WatchSource:0}: Error finding container e44106bc97a902c9a27a38a3f62d031bbd361bb2582f3c0b884655f0f82026b5: Status 404 returned error can't find the container with id e44106bc97a902c9a27a38a3f62d031bbd361bb2582f3c0b884655f0f82026b5 Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.770437 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bqc7r" Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.881881 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxq7s\" (UniqueName: \"kubernetes.io/projected/92557fd1-85f4-48e5-9923-1d833bffe6d5-kube-api-access-mxq7s\") pod \"92557fd1-85f4-48e5-9923-1d833bffe6d5\" (UID: \"92557fd1-85f4-48e5-9923-1d833bffe6d5\") " Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.887432 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92557fd1-85f4-48e5-9923-1d833bffe6d5-kube-api-access-mxq7s" (OuterVolumeSpecName: "kube-api-access-mxq7s") pod "92557fd1-85f4-48e5-9923-1d833bffe6d5" (UID: "92557fd1-85f4-48e5-9923-1d833bffe6d5"). InnerVolumeSpecName "kube-api-access-mxq7s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.983196 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxq7s\" (UniqueName: \"kubernetes.io/projected/92557fd1-85f4-48e5-9923-1d833bffe6d5-kube-api-access-mxq7s\") on node \"crc\" DevicePath \"\"" Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.446988 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-58vcj" event={"ID":"db35ffd8-ac53-48ad-8035-53066c9df48b","Type":"ContainerStarted","Data":"dc77b656e09ee6c636ffbd3d7afbcaf2116a52871b4bd2d76dc5e9e500c3af2a"} Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.447061 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-58vcj" event={"ID":"db35ffd8-ac53-48ad-8035-53066c9df48b","Type":"ContainerStarted","Data":"e44106bc97a902c9a27a38a3f62d031bbd361bb2582f3c0b884655f0f82026b5"} Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.448951 4790 generic.go:334] "Generic (PLEG): container finished" podID="92557fd1-85f4-48e5-9923-1d833bffe6d5" containerID="42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4" exitCode=0 Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.448988 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bqc7r" event={"ID":"92557fd1-85f4-48e5-9923-1d833bffe6d5","Type":"ContainerDied","Data":"42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4"} Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.449032 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bqc7r" event={"ID":"92557fd1-85f4-48e5-9923-1d833bffe6d5","Type":"ContainerDied","Data":"d1b154b866f226aa1234710b3ebe57a7012f2042be36547d641cdb380f9a21d4"} Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.449057 4790 scope.go:117] "RemoveContainer" 
containerID="42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4" Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.449034 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bqc7r" Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.464341 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-58vcj" podStartSLOduration=1.402478769 podStartE2EDuration="1.464320198s" podCreationTimestamp="2026-03-13 20:43:35 +0000 UTC" firstStartedPulling="2026-03-13 20:43:35.612552161 +0000 UTC m=+946.633668052" lastFinishedPulling="2026-03-13 20:43:35.67439359 +0000 UTC m=+946.695509481" observedRunningTime="2026-03-13 20:43:36.462682765 +0000 UTC m=+947.483798686" watchObservedRunningTime="2026-03-13 20:43:36.464320198 +0000 UTC m=+947.485436089" Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.473180 4790 scope.go:117] "RemoveContainer" containerID="42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4" Mar 13 20:43:36 crc kubenswrapper[4790]: E0313 20:43:36.473675 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4\": container with ID starting with 42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4 not found: ID does not exist" containerID="42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4" Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.473718 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4"} err="failed to get container status \"42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4\": rpc error: code = NotFound desc = could not find container 
\"42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4\": container with ID starting with 42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4 not found: ID does not exist" Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.484794 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-bqc7r"] Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.488995 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-bqc7r"] Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.568067 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.579749 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.665056 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:37 crc kubenswrapper[4790]: I0313 20:43:37.670562 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92557fd1-85f4-48e5-9923-1d833bffe6d5" path="/var/lib/kubelet/pods/92557fd1-85f4-48e5-9923-1d833bffe6d5/volumes" Mar 13 20:43:44 crc kubenswrapper[4790]: I0313 20:43:44.016352 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:43:44 crc kubenswrapper[4790]: I0313 20:43:44.016978 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.378227 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-58vcj" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.379776 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-58vcj" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.408242 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-58vcj" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.462503 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kgpkk"] Mar 13 20:43:45 crc kubenswrapper[4790]: E0313 20:43:45.462772 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92557fd1-85f4-48e5-9923-1d833bffe6d5" containerName="registry-server" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.462791 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="92557fd1-85f4-48e5-9923-1d833bffe6d5" containerName="registry-server" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.462900 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="92557fd1-85f4-48e5-9923-1d833bffe6d5" containerName="registry-server" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.463709 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.473901 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgpkk"] Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.526932 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-58vcj" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.616444 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-utilities\") pod \"redhat-marketplace-kgpkk\" (UID: \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.616498 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-catalog-content\") pod \"redhat-marketplace-kgpkk\" (UID: \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.616587 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kwrh\" (UniqueName: \"kubernetes.io/projected/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-kube-api-access-8kwrh\") pod \"redhat-marketplace-kgpkk\" (UID: \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.717793 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-utilities\") pod \"redhat-marketplace-kgpkk\" (UID: 
\"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.717843 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-catalog-content\") pod \"redhat-marketplace-kgpkk\" (UID: \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.717896 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kwrh\" (UniqueName: \"kubernetes.io/projected/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-kube-api-access-8kwrh\") pod \"redhat-marketplace-kgpkk\" (UID: \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.718224 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-utilities\") pod \"redhat-marketplace-kgpkk\" (UID: \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.718265 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-catalog-content\") pod \"redhat-marketplace-kgpkk\" (UID: \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.736825 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kwrh\" (UniqueName: \"kubernetes.io/projected/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-kube-api-access-8kwrh\") pod \"redhat-marketplace-kgpkk\" (UID: 
\"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.791577 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:46 crc kubenswrapper[4790]: I0313 20:43:46.068813 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgpkk"] Mar 13 20:43:46 crc kubenswrapper[4790]: I0313 20:43:46.511926 4790 generic.go:334] "Generic (PLEG): container finished" podID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" containerID="091e18889d4542386c444df0f094e476b4a8e73629907be774b41de3d879523d" exitCode=0 Mar 13 20:43:46 crc kubenswrapper[4790]: I0313 20:43:46.512038 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgpkk" event={"ID":"9d2e8f16-dbb0-48ce-ab69-fba11373e67a","Type":"ContainerDied","Data":"091e18889d4542386c444df0f094e476b4a8e73629907be774b41de3d879523d"} Mar 13 20:43:46 crc kubenswrapper[4790]: I0313 20:43:46.512405 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgpkk" event={"ID":"9d2e8f16-dbb0-48ce-ab69-fba11373e67a","Type":"ContainerStarted","Data":"fbe0b7416d29f07efca01c0abb7eb4bb90760cb7b1eb3da1d6d801e7fefa45f8"} Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.086519 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk"] Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.088138 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.091643 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-5h4dr" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.093132 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk"] Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.154900 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-bundle\") pod \"a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.154974 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-util\") pod \"a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.155093 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct4zx\" (UniqueName: \"kubernetes.io/projected/4f787e63-2dda-4c6f-9c43-0b61658fed8c-kube-api-access-ct4zx\") pod \"a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 
20:43:48.256516 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct4zx\" (UniqueName: \"kubernetes.io/projected/4f787e63-2dda-4c6f-9c43-0b61658fed8c-kube-api-access-ct4zx\") pod \"a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.256845 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-bundle\") pod \"a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.256969 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-util\") pod \"a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.257340 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-bundle\") pod \"a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.257530 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-util\") pod \"a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.278887 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct4zx\" (UniqueName: \"kubernetes.io/projected/4f787e63-2dda-4c6f-9c43-0b61658fed8c-kube-api-access-ct4zx\") pod \"a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.402809 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.528246 4790 generic.go:334] "Generic (PLEG): container finished" podID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" containerID="799d9bb76bc5cb58ad39283b6b767a827b6f2eb34baf58bf7f4c86be001fc098" exitCode=0 Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.528323 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgpkk" event={"ID":"9d2e8f16-dbb0-48ce-ab69-fba11373e67a","Type":"ContainerDied","Data":"799d9bb76bc5cb58ad39283b6b767a827b6f2eb34baf58bf7f4c86be001fc098"} Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.795206 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk"] Mar 13 20:43:49 crc kubenswrapper[4790]: I0313 20:43:49.535646 4790 generic.go:334] "Generic (PLEG): container finished" podID="4f787e63-2dda-4c6f-9c43-0b61658fed8c" 
containerID="9fd7e747c5f75aba3b14cf664cf0d79ce63a62bed0cc8cb9fe547f9eb8e037d7" exitCode=0 Mar 13 20:43:49 crc kubenswrapper[4790]: I0313 20:43:49.535753 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" event={"ID":"4f787e63-2dda-4c6f-9c43-0b61658fed8c","Type":"ContainerDied","Data":"9fd7e747c5f75aba3b14cf664cf0d79ce63a62bed0cc8cb9fe547f9eb8e037d7"} Mar 13 20:43:49 crc kubenswrapper[4790]: I0313 20:43:49.536026 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" event={"ID":"4f787e63-2dda-4c6f-9c43-0b61658fed8c","Type":"ContainerStarted","Data":"caed4dcfd5d370ca1eff47dafd8365437fd64756e69d6356a4612546b1258ad4"} Mar 13 20:43:49 crc kubenswrapper[4790]: I0313 20:43:49.538658 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgpkk" event={"ID":"9d2e8f16-dbb0-48ce-ab69-fba11373e67a","Type":"ContainerStarted","Data":"340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e"} Mar 13 20:43:49 crc kubenswrapper[4790]: I0313 20:43:49.575863 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kgpkk" podStartSLOduration=2.051199118 podStartE2EDuration="4.575842889s" podCreationTimestamp="2026-03-13 20:43:45 +0000 UTC" firstStartedPulling="2026-03-13 20:43:46.51309058 +0000 UTC m=+957.534206471" lastFinishedPulling="2026-03-13 20:43:49.037734351 +0000 UTC m=+960.058850242" observedRunningTime="2026-03-13 20:43:49.574304207 +0000 UTC m=+960.595420108" watchObservedRunningTime="2026-03-13 20:43:49.575842889 +0000 UTC m=+960.596958780" Mar 13 20:43:50 crc kubenswrapper[4790]: I0313 20:43:50.545933 4790 generic.go:334] "Generic (PLEG): container finished" podID="4f787e63-2dda-4c6f-9c43-0b61658fed8c" 
containerID="f4164fba88cd5e16f917481b98911580189954a015fd7c0ae7792fd0306fe622" exitCode=0 Mar 13 20:43:50 crc kubenswrapper[4790]: I0313 20:43:50.545999 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" event={"ID":"4f787e63-2dda-4c6f-9c43-0b61658fed8c","Type":"ContainerDied","Data":"f4164fba88cd5e16f917481b98911580189954a015fd7c0ae7792fd0306fe622"} Mar 13 20:43:51 crc kubenswrapper[4790]: I0313 20:43:51.558144 4790 generic.go:334] "Generic (PLEG): container finished" podID="4f787e63-2dda-4c6f-9c43-0b61658fed8c" containerID="7c073805963588cd60cce2eb9cea583f73b364e5e903872206f07b6527d29cfd" exitCode=0 Mar 13 20:43:51 crc kubenswrapper[4790]: I0313 20:43:51.558190 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" event={"ID":"4f787e63-2dda-4c6f-9c43-0b61658fed8c","Type":"ContainerDied","Data":"7c073805963588cd60cce2eb9cea583f73b364e5e903872206f07b6527d29cfd"} Mar 13 20:43:52 crc kubenswrapper[4790]: I0313 20:43:52.795001 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:52 crc kubenswrapper[4790]: I0313 20:43:52.814821 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-bundle\") pod \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " Mar 13 20:43:52 crc kubenswrapper[4790]: I0313 20:43:52.814966 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-util\") pod \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " Mar 13 20:43:52 crc kubenswrapper[4790]: I0313 20:43:52.814991 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct4zx\" (UniqueName: \"kubernetes.io/projected/4f787e63-2dda-4c6f-9c43-0b61658fed8c-kube-api-access-ct4zx\") pod \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " Mar 13 20:43:52 crc kubenswrapper[4790]: I0313 20:43:52.815469 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-bundle" (OuterVolumeSpecName: "bundle") pod "4f787e63-2dda-4c6f-9c43-0b61658fed8c" (UID: "4f787e63-2dda-4c6f-9c43-0b61658fed8c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:43:52 crc kubenswrapper[4790]: I0313 20:43:52.818703 4790 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:43:52 crc kubenswrapper[4790]: I0313 20:43:52.822012 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f787e63-2dda-4c6f-9c43-0b61658fed8c-kube-api-access-ct4zx" (OuterVolumeSpecName: "kube-api-access-ct4zx") pod "4f787e63-2dda-4c6f-9c43-0b61658fed8c" (UID: "4f787e63-2dda-4c6f-9c43-0b61658fed8c"). InnerVolumeSpecName "kube-api-access-ct4zx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:43:52 crc kubenswrapper[4790]: I0313 20:43:52.830743 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-util" (OuterVolumeSpecName: "util") pod "4f787e63-2dda-4c6f-9c43-0b61658fed8c" (UID: "4f787e63-2dda-4c6f-9c43-0b61658fed8c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:43:52 crc kubenswrapper[4790]: I0313 20:43:52.919979 4790 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-util\") on node \"crc\" DevicePath \"\"" Mar 13 20:43:52 crc kubenswrapper[4790]: I0313 20:43:52.920014 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct4zx\" (UniqueName: \"kubernetes.io/projected/4f787e63-2dda-4c6f-9c43-0b61658fed8c-kube-api-access-ct4zx\") on node \"crc\" DevicePath \"\"" Mar 13 20:43:53 crc kubenswrapper[4790]: I0313 20:43:53.577713 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" event={"ID":"4f787e63-2dda-4c6f-9c43-0b61658fed8c","Type":"ContainerDied","Data":"caed4dcfd5d370ca1eff47dafd8365437fd64756e69d6356a4612546b1258ad4"} Mar 13 20:43:53 crc kubenswrapper[4790]: I0313 20:43:53.577776 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caed4dcfd5d370ca1eff47dafd8365437fd64756e69d6356a4612546b1258ad4" Mar 13 20:43:53 crc kubenswrapper[4790]: I0313 20:43:53.577782 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:55 crc kubenswrapper[4790]: I0313 20:43:55.792364 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:55 crc kubenswrapper[4790]: I0313 20:43:55.792750 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:55 crc kubenswrapper[4790]: I0313 20:43:55.834197 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:56 crc kubenswrapper[4790]: I0313 20:43:56.634765 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:57 crc kubenswrapper[4790]: I0313 20:43:57.985907 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t"] Mar 13 20:43:57 crc kubenswrapper[4790]: E0313 20:43:57.986751 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f787e63-2dda-4c6f-9c43-0b61658fed8c" containerName="util" Mar 13 20:43:57 crc kubenswrapper[4790]: I0313 20:43:57.986771 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f787e63-2dda-4c6f-9c43-0b61658fed8c" containerName="util" Mar 13 20:43:57 crc kubenswrapper[4790]: E0313 20:43:57.986808 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f787e63-2dda-4c6f-9c43-0b61658fed8c" containerName="pull" Mar 13 20:43:57 crc kubenswrapper[4790]: I0313 20:43:57.986817 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f787e63-2dda-4c6f-9c43-0b61658fed8c" containerName="pull" Mar 13 20:43:57 crc kubenswrapper[4790]: E0313 20:43:57.986837 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f787e63-2dda-4c6f-9c43-0b61658fed8c" 
containerName="extract" Mar 13 20:43:57 crc kubenswrapper[4790]: I0313 20:43:57.986847 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f787e63-2dda-4c6f-9c43-0b61658fed8c" containerName="extract" Mar 13 20:43:57 crc kubenswrapper[4790]: I0313 20:43:57.987094 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f787e63-2dda-4c6f-9c43-0b61658fed8c" containerName="extract" Mar 13 20:43:57 crc kubenswrapper[4790]: I0313 20:43:57.988060 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t" Mar 13 20:43:57 crc kubenswrapper[4790]: I0313 20:43:57.992847 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-w9x6j" Mar 13 20:43:58 crc kubenswrapper[4790]: I0313 20:43:58.003558 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t"] Mar 13 20:43:58 crc kubenswrapper[4790]: I0313 20:43:58.083775 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nppw5\" (UniqueName: \"kubernetes.io/projected/87b8083b-23ab-4733-a7ac-85bf1e565551-kube-api-access-nppw5\") pod \"openstack-operator-controller-init-5c46d6fb64-bj72t\" (UID: \"87b8083b-23ab-4733-a7ac-85bf1e565551\") " pod="openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t" Mar 13 20:43:58 crc kubenswrapper[4790]: I0313 20:43:58.184144 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nppw5\" (UniqueName: \"kubernetes.io/projected/87b8083b-23ab-4733-a7ac-85bf1e565551-kube-api-access-nppw5\") pod \"openstack-operator-controller-init-5c46d6fb64-bj72t\" (UID: \"87b8083b-23ab-4733-a7ac-85bf1e565551\") " pod="openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t" Mar 13 20:43:58 crc kubenswrapper[4790]: 
I0313 20:43:58.203795 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nppw5\" (UniqueName: \"kubernetes.io/projected/87b8083b-23ab-4733-a7ac-85bf1e565551-kube-api-access-nppw5\") pod \"openstack-operator-controller-init-5c46d6fb64-bj72t\" (UID: \"87b8083b-23ab-4733-a7ac-85bf1e565551\") " pod="openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t" Mar 13 20:43:58 crc kubenswrapper[4790]: I0313 20:43:58.250474 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgpkk"] Mar 13 20:43:58 crc kubenswrapper[4790]: I0313 20:43:58.306653 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t" Mar 13 20:43:58 crc kubenswrapper[4790]: I0313 20:43:58.607031 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kgpkk" podUID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" containerName="registry-server" containerID="cri-o://340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e" gracePeriod=2 Mar 13 20:43:58 crc kubenswrapper[4790]: I0313 20:43:58.721083 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t"] Mar 13 20:43:58 crc kubenswrapper[4790]: W0313 20:43:58.732282 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87b8083b_23ab_4733_a7ac_85bf1e565551.slice/crio-7968752759ea70234f074f0065fd39045a76f94cadf62da4a11131fc95ef3c1a WatchSource:0}: Error finding container 7968752759ea70234f074f0065fd39045a76f94cadf62da4a11131fc95ef3c1a: Status 404 returned error can't find the container with id 7968752759ea70234f074f0065fd39045a76f94cadf62da4a11131fc95ef3c1a Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.517901 4790 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.613597 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t" event={"ID":"87b8083b-23ab-4733-a7ac-85bf1e565551","Type":"ContainerStarted","Data":"7968752759ea70234f074f0065fd39045a76f94cadf62da4a11131fc95ef3c1a"} Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.615912 4790 generic.go:334] "Generic (PLEG): container finished" podID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" containerID="340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e" exitCode=0 Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.615953 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.615983 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgpkk" event={"ID":"9d2e8f16-dbb0-48ce-ab69-fba11373e67a","Type":"ContainerDied","Data":"340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e"} Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.616037 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgpkk" event={"ID":"9d2e8f16-dbb0-48ce-ab69-fba11373e67a","Type":"ContainerDied","Data":"fbe0b7416d29f07efca01c0abb7eb4bb90760cb7b1eb3da1d6d801e7fefa45f8"} Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.616070 4790 scope.go:117] "RemoveContainer" containerID="340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.637422 4790 scope.go:117] "RemoveContainer" containerID="799d9bb76bc5cb58ad39283b6b767a827b6f2eb34baf58bf7f4c86be001fc098" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.657741 4790 scope.go:117] 
"RemoveContainer" containerID="091e18889d4542386c444df0f094e476b4a8e73629907be774b41de3d879523d" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.697632 4790 scope.go:117] "RemoveContainer" containerID="340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e" Mar 13 20:43:59 crc kubenswrapper[4790]: E0313 20:43:59.700093 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e\": container with ID starting with 340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e not found: ID does not exist" containerID="340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.700143 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e"} err="failed to get container status \"340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e\": rpc error: code = NotFound desc = could not find container \"340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e\": container with ID starting with 340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e not found: ID does not exist" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.700172 4790 scope.go:117] "RemoveContainer" containerID="799d9bb76bc5cb58ad39283b6b767a827b6f2eb34baf58bf7f4c86be001fc098" Mar 13 20:43:59 crc kubenswrapper[4790]: E0313 20:43:59.700464 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"799d9bb76bc5cb58ad39283b6b767a827b6f2eb34baf58bf7f4c86be001fc098\": container with ID starting with 799d9bb76bc5cb58ad39283b6b767a827b6f2eb34baf58bf7f4c86be001fc098 not found: ID does not exist" containerID="799d9bb76bc5cb58ad39283b6b767a827b6f2eb34baf58bf7f4c86be001fc098" Mar 13 20:43:59 crc 
kubenswrapper[4790]: I0313 20:43:59.700494 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"799d9bb76bc5cb58ad39283b6b767a827b6f2eb34baf58bf7f4c86be001fc098"} err="failed to get container status \"799d9bb76bc5cb58ad39283b6b767a827b6f2eb34baf58bf7f4c86be001fc098\": rpc error: code = NotFound desc = could not find container \"799d9bb76bc5cb58ad39283b6b767a827b6f2eb34baf58bf7f4c86be001fc098\": container with ID starting with 799d9bb76bc5cb58ad39283b6b767a827b6f2eb34baf58bf7f4c86be001fc098 not found: ID does not exist" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.700513 4790 scope.go:117] "RemoveContainer" containerID="091e18889d4542386c444df0f094e476b4a8e73629907be774b41de3d879523d" Mar 13 20:43:59 crc kubenswrapper[4790]: E0313 20:43:59.700772 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"091e18889d4542386c444df0f094e476b4a8e73629907be774b41de3d879523d\": container with ID starting with 091e18889d4542386c444df0f094e476b4a8e73629907be774b41de3d879523d not found: ID does not exist" containerID="091e18889d4542386c444df0f094e476b4a8e73629907be774b41de3d879523d" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.700800 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"091e18889d4542386c444df0f094e476b4a8e73629907be774b41de3d879523d"} err="failed to get container status \"091e18889d4542386c444df0f094e476b4a8e73629907be774b41de3d879523d\": rpc error: code = NotFound desc = could not find container \"091e18889d4542386c444df0f094e476b4a8e73629907be774b41de3d879523d\": container with ID starting with 091e18889d4542386c444df0f094e476b4a8e73629907be774b41de3d879523d not found: ID does not exist" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.703301 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-catalog-content\") pod \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\" (UID: \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.703488 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kwrh\" (UniqueName: \"kubernetes.io/projected/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-kube-api-access-8kwrh\") pod \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\" (UID: \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.703543 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-utilities\") pod \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\" (UID: \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.704390 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-utilities" (OuterVolumeSpecName: "utilities") pod "9d2e8f16-dbb0-48ce-ab69-fba11373e67a" (UID: "9d2e8f16-dbb0-48ce-ab69-fba11373e67a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.711067 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-kube-api-access-8kwrh" (OuterVolumeSpecName: "kube-api-access-8kwrh") pod "9d2e8f16-dbb0-48ce-ab69-fba11373e67a" (UID: "9d2e8f16-dbb0-48ce-ab69-fba11373e67a"). InnerVolumeSpecName "kube-api-access-8kwrh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.730918 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d2e8f16-dbb0-48ce-ab69-fba11373e67a" (UID: "9d2e8f16-dbb0-48ce-ab69-fba11373e67a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.805118 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.805504 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kwrh\" (UniqueName: \"kubernetes.io/projected/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-kube-api-access-8kwrh\") on node \"crc\" DevicePath \"\"" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.805525 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.954138 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgpkk"] Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.965966 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgpkk"] Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.154769 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557244-sndr9"] Mar 13 20:44:00 crc kubenswrapper[4790]: E0313 20:44:00.155265 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" 
containerName="registry-server" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.155287 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" containerName="registry-server" Mar 13 20:44:00 crc kubenswrapper[4790]: E0313 20:44:00.155301 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" containerName="extract-utilities" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.155310 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" containerName="extract-utilities" Mar 13 20:44:00 crc kubenswrapper[4790]: E0313 20:44:00.155323 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" containerName="extract-content" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.155335 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" containerName="extract-content" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.155550 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" containerName="registry-server" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.156241 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557244-sndr9" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.158310 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.160046 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.160298 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.161677 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557244-sndr9"] Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.310520 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwhhn\" (UniqueName: \"kubernetes.io/projected/7f42b93e-6de8-423c-a2d5-dd57885de32c-kube-api-access-kwhhn\") pod \"auto-csr-approver-29557244-sndr9\" (UID: \"7f42b93e-6de8-423c-a2d5-dd57885de32c\") " pod="openshift-infra/auto-csr-approver-29557244-sndr9" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.411788 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwhhn\" (UniqueName: \"kubernetes.io/projected/7f42b93e-6de8-423c-a2d5-dd57885de32c-kube-api-access-kwhhn\") pod \"auto-csr-approver-29557244-sndr9\" (UID: \"7f42b93e-6de8-423c-a2d5-dd57885de32c\") " pod="openshift-infra/auto-csr-approver-29557244-sndr9" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.429849 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwhhn\" (UniqueName: \"kubernetes.io/projected/7f42b93e-6de8-423c-a2d5-dd57885de32c-kube-api-access-kwhhn\") pod \"auto-csr-approver-29557244-sndr9\" (UID: \"7f42b93e-6de8-423c-a2d5-dd57885de32c\") " 
pod="openshift-infra/auto-csr-approver-29557244-sndr9" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.475887 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557244-sndr9" Mar 13 20:44:01 crc kubenswrapper[4790]: I0313 20:44:01.671395 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" path="/var/lib/kubelet/pods/9d2e8f16-dbb0-48ce-ab69-fba11373e67a/volumes" Mar 13 20:44:02 crc kubenswrapper[4790]: I0313 20:44:02.940016 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557244-sndr9"] Mar 13 20:44:03 crc kubenswrapper[4790]: I0313 20:44:03.666980 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t" Mar 13 20:44:03 crc kubenswrapper[4790]: I0313 20:44:03.667278 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t" event={"ID":"87b8083b-23ab-4733-a7ac-85bf1e565551","Type":"ContainerStarted","Data":"855ab8ac8ecde547b97686731e931e4ea878b7ff76196b393fca2fe9f0074695"} Mar 13 20:44:03 crc kubenswrapper[4790]: I0313 20:44:03.667297 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557244-sndr9" event={"ID":"7f42b93e-6de8-423c-a2d5-dd57885de32c","Type":"ContainerStarted","Data":"d8f4e3de2382a875a84d1c776e64fd6b6600c72b4dbf64d6c90df645bb558dd6"} Mar 13 20:44:03 crc kubenswrapper[4790]: I0313 20:44:03.690151 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t" podStartSLOduration=2.781983377 podStartE2EDuration="6.690132941s" podCreationTimestamp="2026-03-13 20:43:57 +0000 UTC" firstStartedPulling="2026-03-13 20:43:58.735078056 +0000 UTC m=+969.756193947" 
lastFinishedPulling="2026-03-13 20:44:02.64322762 +0000 UTC m=+973.664343511" observedRunningTime="2026-03-13 20:44:03.689184376 +0000 UTC m=+974.710300267" watchObservedRunningTime="2026-03-13 20:44:03.690132941 +0000 UTC m=+974.711248832" Mar 13 20:44:04 crc kubenswrapper[4790]: I0313 20:44:04.669903 4790 generic.go:334] "Generic (PLEG): container finished" podID="7f42b93e-6de8-423c-a2d5-dd57885de32c" containerID="721d15acd59eb0b2b9f8d48eaa51f02f0b2b5cc626d1243f5a398968f008ce5a" exitCode=0 Mar 13 20:44:04 crc kubenswrapper[4790]: I0313 20:44:04.670367 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557244-sndr9" event={"ID":"7f42b93e-6de8-423c-a2d5-dd57885de32c","Type":"ContainerDied","Data":"721d15acd59eb0b2b9f8d48eaa51f02f0b2b5cc626d1243f5a398968f008ce5a"} Mar 13 20:44:05 crc kubenswrapper[4790]: I0313 20:44:05.905101 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557244-sndr9" Mar 13 20:44:06 crc kubenswrapper[4790]: I0313 20:44:06.096154 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwhhn\" (UniqueName: \"kubernetes.io/projected/7f42b93e-6de8-423c-a2d5-dd57885de32c-kube-api-access-kwhhn\") pod \"7f42b93e-6de8-423c-a2d5-dd57885de32c\" (UID: \"7f42b93e-6de8-423c-a2d5-dd57885de32c\") " Mar 13 20:44:06 crc kubenswrapper[4790]: I0313 20:44:06.101202 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f42b93e-6de8-423c-a2d5-dd57885de32c-kube-api-access-kwhhn" (OuterVolumeSpecName: "kube-api-access-kwhhn") pod "7f42b93e-6de8-423c-a2d5-dd57885de32c" (UID: "7f42b93e-6de8-423c-a2d5-dd57885de32c"). InnerVolumeSpecName "kube-api-access-kwhhn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:44:06 crc kubenswrapper[4790]: I0313 20:44:06.198885 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwhhn\" (UniqueName: \"kubernetes.io/projected/7f42b93e-6de8-423c-a2d5-dd57885de32c-kube-api-access-kwhhn\") on node \"crc\" DevicePath \"\"" Mar 13 20:44:06 crc kubenswrapper[4790]: I0313 20:44:06.682416 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557244-sndr9" event={"ID":"7f42b93e-6de8-423c-a2d5-dd57885de32c","Type":"ContainerDied","Data":"d8f4e3de2382a875a84d1c776e64fd6b6600c72b4dbf64d6c90df645bb558dd6"} Mar 13 20:44:06 crc kubenswrapper[4790]: I0313 20:44:06.682472 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8f4e3de2382a875a84d1c776e64fd6b6600c72b4dbf64d6c90df645bb558dd6" Mar 13 20:44:06 crc kubenswrapper[4790]: I0313 20:44:06.682516 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557244-sndr9" Mar 13 20:44:06 crc kubenswrapper[4790]: I0313 20:44:06.954018 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557238-jx8wj"] Mar 13 20:44:06 crc kubenswrapper[4790]: I0313 20:44:06.966801 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557238-jx8wj"] Mar 13 20:44:07 crc kubenswrapper[4790]: I0313 20:44:07.669948 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c63bf97-e702-439a-8f3b-58d4496c91b9" path="/var/lib/kubelet/pods/6c63bf97-e702-439a-8f3b-58d4496c91b9/volumes" Mar 13 20:44:08 crc kubenswrapper[4790]: I0313 20:44:08.310078 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t" Mar 13 20:44:14 crc kubenswrapper[4790]: I0313 20:44:14.018494 4790 patch_prober.go:28] interesting 
pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:44:14 crc kubenswrapper[4790]: I0313 20:44:14.018800 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.285416 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jgrz9"] Mar 13 20:44:27 crc kubenswrapper[4790]: E0313 20:44:27.286256 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f42b93e-6de8-423c-a2d5-dd57885de32c" containerName="oc" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.286272 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f42b93e-6de8-423c-a2d5-dd57885de32c" containerName="oc" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.286433 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f42b93e-6de8-423c-a2d5-dd57885de32c" containerName="oc" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.287395 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.301031 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jgrz9"] Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.403508 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf67n\" (UniqueName: \"kubernetes.io/projected/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-kube-api-access-gf67n\") pod \"certified-operators-jgrz9\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.403825 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-utilities\") pod \"certified-operators-jgrz9\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.403987 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-catalog-content\") pod \"certified-operators-jgrz9\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.505661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf67n\" (UniqueName: \"kubernetes.io/projected/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-kube-api-access-gf67n\") pod \"certified-operators-jgrz9\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.505715 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-utilities\") pod \"certified-operators-jgrz9\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.505758 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-catalog-content\") pod \"certified-operators-jgrz9\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.506223 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-catalog-content\") pod \"certified-operators-jgrz9\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.506442 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-utilities\") pod \"certified-operators-jgrz9\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.526305 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf67n\" (UniqueName: \"kubernetes.io/projected/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-kube-api-access-gf67n\") pod \"certified-operators-jgrz9\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.614004 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.881186 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jgrz9"] Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.823016 4790 generic.go:334] "Generic (PLEG): container finished" podID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" containerID="13883d616d7859b8b1f4e3643b2470ceb4a60d0faba96109c31a1ecc31533caa" exitCode=0 Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.823447 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgrz9" event={"ID":"8883fcbc-75ff-43e3-8088-f2ba848e9d3a","Type":"ContainerDied","Data":"13883d616d7859b8b1f4e3643b2470ceb4a60d0faba96109c31a1ecc31533caa"} Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.823477 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgrz9" event={"ID":"8883fcbc-75ff-43e3-8088-f2ba848e9d3a","Type":"ContainerStarted","Data":"94c84ec1662023adbd79b891587ec02bac606782a1b69fbe98e2395146aadf04"} Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.888837 4790 scope.go:117] "RemoveContainer" containerID="a1eeddc06106c1113c4a31e23128dada69c832330fa1711ed5544055f1b4392f" Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.954392 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-s8p67"] Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.955463 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.957711 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-c789s" Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.961272 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh"] Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.962085 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh" Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.965730 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-xm7hh" Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.969647 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-s8p67"] Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.980027 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh"] Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.992028 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9"] Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.992935 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.995727 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-6gbht" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.008457 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.054580 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.056208 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.056313 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.061590 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-zth67" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.089497 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.090464 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.093648 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-8wnbp" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.099804 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.111278 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.112467 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.119996 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-9wgrk" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.120336 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.140006 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.140882 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.145724 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.149532 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-km8xp" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.152560 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.153362 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.154884 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-nc2qj" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.164083 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sznf\" (UniqueName: \"kubernetes.io/projected/bdbe5269-1150-4269-bc28-1d719f1b77b6-kube-api-access-7sznf\") pod \"barbican-operator-controller-manager-d47688694-s8p67\" (UID: \"bdbe5269-1150-4269-bc28-1d719f1b77b6\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.164186 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzgl8\" (UniqueName: \"kubernetes.io/projected/dd8df218-c492-4e48-93a9-f5f2dbf7fc00-kube-api-access-rzgl8\") pod \"cinder-operator-controller-manager-984cd4dcf-5plwh\" (UID: \"dd8df218-c492-4e48-93a9-f5f2dbf7fc00\") " 
pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.164207 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxdfb\" (UniqueName: \"kubernetes.io/projected/46fb44a5-f567-4f58-80b1-dd70694f9339-kube-api-access-xxdfb\") pod \"designate-operator-controller-manager-66d56f6ff4-h7rc9\" (UID: \"46fb44a5-f567-4f58-80b1-dd70694f9339\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.164994 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.165977 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.176915 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-2lhj2" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.184874 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.194661 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.198000 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.201792 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-n6tgh" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.224770 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.233510 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.261016 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.261805 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.265276 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-8lttf" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.265987 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j9wn\" (UniqueName: \"kubernetes.io/projected/460b6997-f558-4e5f-9e15-aa33fece4f4b-kube-api-access-5j9wn\") pod \"horizon-operator-controller-manager-6d9d6b584d-nzdzx\" (UID: \"460b6997-f558-4e5f-9e15-aa33fece4f4b\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.266026 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbqlp\" (UniqueName: \"kubernetes.io/projected/77f24ce6-bc52-4831-902c-255983a8f911-kube-api-access-sbqlp\") pod \"keystone-operator-controller-manager-684f77d66d-5vcsg\" (UID: \"77f24ce6-bc52-4831-902c-255983a8f911\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.266060 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzgl8\" (UniqueName: \"kubernetes.io/projected/dd8df218-c492-4e48-93a9-f5f2dbf7fc00-kube-api-access-rzgl8\") pod \"cinder-operator-controller-manager-984cd4dcf-5plwh\" (UID: \"dd8df218-c492-4e48-93a9-f5f2dbf7fc00\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.266091 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxdfb\" (UniqueName: \"kubernetes.io/projected/46fb44a5-f567-4f58-80b1-dd70694f9339-kube-api-access-xxdfb\") 
pod \"designate-operator-controller-manager-66d56f6ff4-h7rc9\" (UID: \"46fb44a5-f567-4f58-80b1-dd70694f9339\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.266172 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8qt2\" (UniqueName: \"kubernetes.io/projected/7caf7136-8a46-410b-8a32-72ab19e8baca-kube-api-access-h8qt2\") pod \"infra-operator-controller-manager-54dc5b8f8d-jrr7h\" (UID: \"7caf7136-8a46-410b-8a32-72ab19e8baca\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.266241 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sznf\" (UniqueName: \"kubernetes.io/projected/bdbe5269-1150-4269-bc28-1d719f1b77b6-kube-api-access-7sznf\") pod \"barbican-operator-controller-manager-d47688694-s8p67\" (UID: \"bdbe5269-1150-4269-bc28-1d719f1b77b6\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.266277 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krblr\" (UniqueName: \"kubernetes.io/projected/e154cc44-2769-4bfe-b8ef-3f6c56f08f74-kube-api-access-krblr\") pod \"glance-operator-controller-manager-5964f64c48-tzx96\" (UID: \"e154cc44-2769-4bfe-b8ef-3f6c56f08f74\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.266444 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-jrr7h\" (UID: \"7caf7136-8a46-410b-8a32-72ab19e8baca\") " 
pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.266491 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlzvf\" (UniqueName: \"kubernetes.io/projected/a7488d00-50bc-4ce8-ae0a-8d3ff807c0da-kube-api-access-zlzvf\") pod \"heat-operator-controller-manager-77b6666d85-q5nj7\" (UID: \"a7488d00-50bc-4ce8-ae0a-8d3ff807c0da\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.266598 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr89w\" (UniqueName: \"kubernetes.io/projected/2747d064-d45f-4a4e-87c2-d2c9f82eac10-kube-api-access-zr89w\") pod \"ironic-operator-controller-manager-5bc894d9b-wfltj\" (UID: \"2747d064-d45f-4a4e-87c2-d2c9f82eac10\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.275202 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.280738 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.283225 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.292265 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.293787 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-8685q" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.311253 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.312050 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.315430 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-s96ts" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.323721 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxdfb\" (UniqueName: \"kubernetes.io/projected/46fb44a5-f567-4f58-80b1-dd70694f9339-kube-api-access-xxdfb\") pod \"designate-operator-controller-manager-66d56f6ff4-h7rc9\" (UID: \"46fb44a5-f567-4f58-80b1-dd70694f9339\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.324071 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sznf\" (UniqueName: \"kubernetes.io/projected/bdbe5269-1150-4269-bc28-1d719f1b77b6-kube-api-access-7sznf\") pod \"barbican-operator-controller-manager-d47688694-s8p67\" (UID: \"bdbe5269-1150-4269-bc28-1d719f1b77b6\") " 
pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.338348 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzgl8\" (UniqueName: \"kubernetes.io/projected/dd8df218-c492-4e48-93a9-f5f2dbf7fc00-kube-api-access-rzgl8\") pod \"cinder-operator-controller-manager-984cd4dcf-5plwh\" (UID: \"dd8df218-c492-4e48-93a9-f5f2dbf7fc00\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.338737 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.351887 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.366556 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.367475 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j9wn\" (UniqueName: \"kubernetes.io/projected/460b6997-f558-4e5f-9e15-aa33fece4f4b-kube-api-access-5j9wn\") pod \"horizon-operator-controller-manager-6d9d6b584d-nzdzx\" (UID: \"460b6997-f558-4e5f-9e15-aa33fece4f4b\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.367615 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnq54\" (UniqueName: \"kubernetes.io/projected/5befe4e4-4574-42ac-90ce-ac67c1e33eee-kube-api-access-wnq54\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v\" (UID: \"5befe4e4-4574-42ac-90ce-ac67c1e33eee\") " 
pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.367698 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbqlp\" (UniqueName: \"kubernetes.io/projected/77f24ce6-bc52-4831-902c-255983a8f911-kube-api-access-sbqlp\") pod \"keystone-operator-controller-manager-684f77d66d-5vcsg\" (UID: \"77f24ce6-bc52-4831-902c-255983a8f911\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.367774 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsbbn\" (UniqueName: \"kubernetes.io/projected/b5a018c4-3e3a-4f77-a272-20c94a5b9c7a-kube-api-access-fsbbn\") pod \"manila-operator-controller-manager-57b484b4df-hlk9s\" (UID: \"b5a018c4-3e3a-4f77-a272-20c94a5b9c7a\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.367887 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8qt2\" (UniqueName: \"kubernetes.io/projected/7caf7136-8a46-410b-8a32-72ab19e8baca-kube-api-access-h8qt2\") pod \"infra-operator-controller-manager-54dc5b8f8d-jrr7h\" (UID: \"7caf7136-8a46-410b-8a32-72ab19e8baca\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.367978 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.367981 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krblr\" (UniqueName: \"kubernetes.io/projected/e154cc44-2769-4bfe-b8ef-3f6c56f08f74-kube-api-access-krblr\") pod \"glance-operator-controller-manager-5964f64c48-tzx96\" (UID: \"e154cc44-2769-4bfe-b8ef-3f6c56f08f74\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.368919 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-jrr7h\" (UID: \"7caf7136-8a46-410b-8a32-72ab19e8baca\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.369016 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlzvf\" (UniqueName: \"kubernetes.io/projected/a7488d00-50bc-4ce8-ae0a-8d3ff807c0da-kube-api-access-zlzvf\") pod \"heat-operator-controller-manager-77b6666d85-q5nj7\" (UID: \"a7488d00-50bc-4ce8-ae0a-8d3ff807c0da\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.369091 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr89w\" (UniqueName: \"kubernetes.io/projected/2747d064-d45f-4a4e-87c2-d2c9f82eac10-kube-api-access-zr89w\") pod \"ironic-operator-controller-manager-5bc894d9b-wfltj\" (UID: \"2747d064-d45f-4a4e-87c2-d2c9f82eac10\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj" Mar 13 20:44:29 crc kubenswrapper[4790]: E0313 20:44:29.369022 4790 secret.go:188] Couldn't get secret 
openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 20:44:29 crc kubenswrapper[4790]: E0313 20:44:29.369233 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert podName:7caf7136-8a46-410b-8a32-72ab19e8baca nodeName:}" failed. No retries permitted until 2026-03-13 20:44:29.869209515 +0000 UTC m=+1000.890325406 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert") pod "infra-operator-controller-manager-54dc5b8f8d-jrr7h" (UID: "7caf7136-8a46-410b-8a32-72ab19e8baca") : secret "infra-operator-webhook-server-cert" not found Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.373717 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-6h66p" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.392792 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.404258 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbqlp\" (UniqueName: \"kubernetes.io/projected/77f24ce6-bc52-4831-902c-255983a8f911-kube-api-access-sbqlp\") pod \"keystone-operator-controller-manager-684f77d66d-5vcsg\" (UID: \"77f24ce6-bc52-4831-902c-255983a8f911\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.406267 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krblr\" (UniqueName: \"kubernetes.io/projected/e154cc44-2769-4bfe-b8ef-3f6c56f08f74-kube-api-access-krblr\") pod \"glance-operator-controller-manager-5964f64c48-tzx96\" (UID: \"e154cc44-2769-4bfe-b8ef-3f6c56f08f74\") " 
pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.406810 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr89w\" (UniqueName: \"kubernetes.io/projected/2747d064-d45f-4a4e-87c2-d2c9f82eac10-kube-api-access-zr89w\") pod \"ironic-operator-controller-manager-5bc894d9b-wfltj\" (UID: \"2747d064-d45f-4a4e-87c2-d2c9f82eac10\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.406926 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j9wn\" (UniqueName: \"kubernetes.io/projected/460b6997-f558-4e5f-9e15-aa33fece4f4b-kube-api-access-5j9wn\") pod \"horizon-operator-controller-manager-6d9d6b584d-nzdzx\" (UID: \"460b6997-f558-4e5f-9e15-aa33fece4f4b\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.409431 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8qt2\" (UniqueName: \"kubernetes.io/projected/7caf7136-8a46-410b-8a32-72ab19e8baca-kube-api-access-h8qt2\") pod \"infra-operator-controller-manager-54dc5b8f8d-jrr7h\" (UID: \"7caf7136-8a46-410b-8a32-72ab19e8baca\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.412018 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlzvf\" (UniqueName: \"kubernetes.io/projected/a7488d00-50bc-4ce8-ae0a-8d3ff807c0da-kube-api-access-zlzvf\") pod \"heat-operator-controller-manager-77b6666d85-q5nj7\" (UID: \"a7488d00-50bc-4ce8-ae0a-8d3ff807c0da\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.421756 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.432711 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.433804 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.439609 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.440125 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-rxwr4" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.446279 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.447172 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.452795 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-9fht8" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.464951 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.470099 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2hzf\" (UniqueName: \"kubernetes.io/projected/403c2990-8871-47da-abd8-8c9fc5753d54-kube-api-access-g2hzf\") pod \"octavia-operator-controller-manager-5f4f55cb5c-tbbfl\" (UID: \"403c2990-8871-47da-abd8-8c9fc5753d54\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.470217 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jbvd\" (UniqueName: \"kubernetes.io/projected/499aa973-6f5e-4229-9282-52c4fbf0625f-kube-api-access-7jbvd\") pod \"neutron-operator-controller-manager-776c5696bf-dxntp\" (UID: \"499aa973-6f5e-4229-9282-52c4fbf0625f\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.470275 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnq54\" (UniqueName: \"kubernetes.io/projected/5befe4e4-4574-42ac-90ce-ac67c1e33eee-kube-api-access-wnq54\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v\" (UID: \"5befe4e4-4574-42ac-90ce-ac67c1e33eee\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.470326 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsbbn\" (UniqueName: \"kubernetes.io/projected/b5a018c4-3e3a-4f77-a272-20c94a5b9c7a-kube-api-access-fsbbn\") pod \"manila-operator-controller-manager-57b484b4df-hlk9s\" (UID: \"b5a018c4-3e3a-4f77-a272-20c94a5b9c7a\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s" Mar 13 20:44:29 
crc kubenswrapper[4790]: I0313 20:44:29.470355 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpxs5\" (UniqueName: \"kubernetes.io/projected/386f7e46-c2e3-4eae-aa82-05075883c889-kube-api-access-cpxs5\") pod \"nova-operator-controller-manager-7f84474648-b8lpj\" (UID: \"386f7e46-c2e3-4eae-aa82-05075883c889\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.481657 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.482642 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.492673 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-5zqv4" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.497591 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsbbn\" (UniqueName: \"kubernetes.io/projected/b5a018c4-3e3a-4f77-a272-20c94a5b9c7a-kube-api-access-fsbbn\") pod \"manila-operator-controller-manager-57b484b4df-hlk9s\" (UID: \"b5a018c4-3e3a-4f77-a272-20c94a5b9c7a\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.503235 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnq54\" (UniqueName: \"kubernetes.io/projected/5befe4e4-4574-42ac-90ce-ac67c1e33eee-kube-api-access-wnq54\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v\" (UID: \"5befe4e4-4574-42ac-90ce-ac67c1e33eee\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" Mar 13 20:44:29 crc 
kubenswrapper[4790]: I0313 20:44:29.503288 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.525004 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.525658 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.529601 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.540041 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-nt4tx" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.543326 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.552454 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.590768 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp9hr\" (UniqueName: \"kubernetes.io/projected/b1273818-139a-4213-b23c-609a7305c92f-kube-api-access-sp9hr\") pod \"ovn-operator-controller-manager-bbc5b68f9-hwdv8\" (UID: \"b1273818-139a-4213-b23c-609a7305c92f\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.590844 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn\" (UID: \"5622f52e-2e94-41ca-a9d2-a0c833895937\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.591350 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sslbn\" (UniqueName: \"kubernetes.io/projected/b36f993b-25cd-4f12-bf48-77bf6f4cf26b-kube-api-access-sslbn\") pod \"placement-operator-controller-manager-574d45c66c-c9lbv\" (UID: \"b36f993b-25cd-4f12-bf48-77bf6f4cf26b\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.591768 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28d98\" (UniqueName: 
\"kubernetes.io/projected/5622f52e-2e94-41ca-a9d2-a0c833895937-kube-api-access-28d98\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn\" (UID: \"5622f52e-2e94-41ca-a9d2-a0c833895937\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.592207 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2hzf\" (UniqueName: \"kubernetes.io/projected/403c2990-8871-47da-abd8-8c9fc5753d54-kube-api-access-g2hzf\") pod \"octavia-operator-controller-manager-5f4f55cb5c-tbbfl\" (UID: \"403c2990-8871-47da-abd8-8c9fc5753d54\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.592744 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jbvd\" (UniqueName: \"kubernetes.io/projected/499aa973-6f5e-4229-9282-52c4fbf0625f-kube-api-access-7jbvd\") pod \"neutron-operator-controller-manager-776c5696bf-dxntp\" (UID: \"499aa973-6f5e-4229-9282-52c4fbf0625f\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.592874 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpxs5\" (UniqueName: \"kubernetes.io/projected/386f7e46-c2e3-4eae-aa82-05075883c889-kube-api-access-cpxs5\") pod \"nova-operator-controller-manager-7f84474648-b8lpj\" (UID: \"386f7e46-c2e3-4eae-aa82-05075883c889\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.593233 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.597770 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.598292 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.598453 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.609841 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.615158 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2hzf\" (UniqueName: \"kubernetes.io/projected/403c2990-8871-47da-abd8-8c9fc5753d54-kube-api-access-g2hzf\") pod \"octavia-operator-controller-manager-5f4f55cb5c-tbbfl\" (UID: \"403c2990-8871-47da-abd8-8c9fc5753d54\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.618866 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.623105 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jbvd\" (UniqueName: \"kubernetes.io/projected/499aa973-6f5e-4229-9282-52c4fbf0625f-kube-api-access-7jbvd\") pod \"neutron-operator-controller-manager-776c5696bf-dxntp\" (UID: \"499aa973-6f5e-4229-9282-52c4fbf0625f\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.629324 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.632426 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpxs5\" (UniqueName: \"kubernetes.io/projected/386f7e46-c2e3-4eae-aa82-05075883c889-kube-api-access-cpxs5\") pod \"nova-operator-controller-manager-7f84474648-b8lpj\" (UID: \"386f7e46-c2e3-4eae-aa82-05075883c889\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.663606 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.676925 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.677733 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.680923 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-ls2zb" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.700196 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sslbn\" (UniqueName: \"kubernetes.io/projected/b36f993b-25cd-4f12-bf48-77bf6f4cf26b-kube-api-access-sslbn\") pod \"placement-operator-controller-manager-574d45c66c-c9lbv\" (UID: \"b36f993b-25cd-4f12-bf48-77bf6f4cf26b\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.700420 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28d98\" (UniqueName: \"kubernetes.io/projected/5622f52e-2e94-41ca-a9d2-a0c833895937-kube-api-access-28d98\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn\" (UID: \"5622f52e-2e94-41ca-a9d2-a0c833895937\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.704778 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.706034 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.706212 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.707311 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.708277 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkr4g\" (UniqueName: \"kubernetes.io/projected/0244e4ae-2ccd-482a-b490-58a8e46ab53d-kube-api-access-zkr4g\") pod \"swift-operator-controller-manager-7f9cc5dd44-ppzzz\" (UID: \"0244e4ae-2ccd-482a-b490-58a8e46ab53d\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.709584 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp9hr\" (UniqueName: \"kubernetes.io/projected/b1273818-139a-4213-b23c-609a7305c92f-kube-api-access-sp9hr\") pod \"ovn-operator-controller-manager-bbc5b68f9-hwdv8\" (UID: \"b1273818-139a-4213-b23c-609a7305c92f\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.709648 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn\" (UID: \"5622f52e-2e94-41ca-a9d2-a0c833895937\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:29 crc kubenswrapper[4790]: E0313 20:44:29.709774 4790 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 20:44:29 crc kubenswrapper[4790]: E0313 20:44:29.709835 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert podName:5622f52e-2e94-41ca-a9d2-a0c833895937 nodeName:}" failed. 
No retries permitted until 2026-03-13 20:44:30.209822025 +0000 UTC m=+1001.230937916 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" (UID: "5622f52e-2e94-41ca-a9d2-a0c833895937") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.711427 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.714197 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-kbjrq" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.728748 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sslbn\" (UniqueName: \"kubernetes.io/projected/b36f993b-25cd-4f12-bf48-77bf6f4cf26b-kube-api-access-sslbn\") pod \"placement-operator-controller-manager-574d45c66c-c9lbv\" (UID: \"b36f993b-25cd-4f12-bf48-77bf6f4cf26b\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.746190 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp9hr\" (UniqueName: \"kubernetes.io/projected/b1273818-139a-4213-b23c-609a7305c92f-kube-api-access-sp9hr\") pod \"ovn-operator-controller-manager-bbc5b68f9-hwdv8\" (UID: \"b1273818-139a-4213-b23c-609a7305c92f\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.747676 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28d98\" (UniqueName: \"kubernetes.io/projected/5622f52e-2e94-41ca-a9d2-a0c833895937-kube-api-access-28d98\") pod 
\"openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn\" (UID: \"5622f52e-2e94-41ca-a9d2-a0c833895937\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.750398 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.797006 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.816455 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.817421 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.818834 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkr4g\" (UniqueName: \"kubernetes.io/projected/0244e4ae-2ccd-482a-b490-58a8e46ab53d-kube-api-access-zkr4g\") pod \"swift-operator-controller-manager-7f9cc5dd44-ppzzz\" (UID: \"0244e4ae-2ccd-482a-b490-58a8e46ab53d\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.818888 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ql5n\" (UniqueName: \"kubernetes.io/projected/2032df10-91a5-4a88-9705-c355f50a5024-kube-api-access-5ql5n\") pod \"telemetry-operator-controller-manager-6854b8b9d9-f8l4s\" (UID: \"2032df10-91a5-4a88-9705-c355f50a5024\") " pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" Mar 
13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.829642 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-dsggs" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.830573 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.858491 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.859559 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.864479 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.864680 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.864888 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-hzb9g" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.866788 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.868244 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkr4g\" (UniqueName: \"kubernetes.io/projected/0244e4ae-2ccd-482a-b490-58a8e46ab53d-kube-api-access-zkr4g\") pod \"swift-operator-controller-manager-7f9cc5dd44-ppzzz\" (UID: \"0244e4ae-2ccd-482a-b490-58a8e46ab53d\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.873983 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.880096 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.917140 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.920560 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.920655 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbwnz\" (UniqueName: \"kubernetes.io/projected/47bdfeda-c97a-40b5-82f8-1008ba20e75b-kube-api-access-rbwnz\") pod \"watcher-operator-controller-manager-6c4d75f7f9-5689f\" (UID: \"47bdfeda-c97a-40b5-82f8-1008ba20e75b\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.920691 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9zgd\" (UniqueName: \"kubernetes.io/projected/a36ba835-deb4-41f5-9b6a-57d1e577c8b1-kube-api-access-q9zgd\") pod \"test-operator-controller-manager-5c5cb9c4d7-cfb9g\" (UID: \"a36ba835-deb4-41f5-9b6a-57d1e577c8b1\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.920718 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-jrr7h\" (UID: \"7caf7136-8a46-410b-8a32-72ab19e8baca\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 
20:44:29.920749 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ql5n\" (UniqueName: \"kubernetes.io/projected/2032df10-91a5-4a88-9705-c355f50a5024-kube-api-access-5ql5n\") pod \"telemetry-operator-controller-manager-6854b8b9d9-f8l4s\" (UID: \"2032df10-91a5-4a88-9705-c355f50a5024\") " pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.920798 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.920823 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdqpb\" (UniqueName: \"kubernetes.io/projected/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-kube-api-access-sdqpb\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:29 crc kubenswrapper[4790]: E0313 20:44:29.920976 4790 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 20:44:29 crc kubenswrapper[4790]: E0313 20:44:29.921035 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert podName:7caf7136-8a46-410b-8a32-72ab19e8baca nodeName:}" failed. No retries permitted until 2026-03-13 20:44:30.921017835 +0000 UTC m=+1001.942133726 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert") pod "infra-operator-controller-manager-54dc5b8f8d-jrr7h" (UID: "7caf7136-8a46-410b-8a32-72ab19e8baca") : secret "infra-operator-webhook-server-cert" not found Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.946023 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.946873 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.949247 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ql5n\" (UniqueName: \"kubernetes.io/projected/2032df10-91a5-4a88-9705-c355f50a5024-kube-api-access-5ql5n\") pod \"telemetry-operator-controller-manager-6854b8b9d9-f8l4s\" (UID: \"2032df10-91a5-4a88-9705-c355f50a5024\") " pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.952680 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.956124 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-sn67s" Mar 13 20:44:29 crc kubenswrapper[4790]: W0313 20:44:29.961490 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7488d00_50bc_4ce8_ae0a_8d3ff807c0da.slice/crio-88314e17948ded46962315cdf0d51b36233b450d17b62ec8aa726fd2277a480b WatchSource:0}: Error finding container 88314e17948ded46962315cdf0d51b36233b450d17b62ec8aa726fd2277a480b: Status 404 returned 
error can't find the container with id 88314e17948ded46962315cdf0d51b36233b450d17b62ec8aa726fd2277a480b Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.985095 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7"] Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.023899 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.024540 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.024597 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdqpb\" (UniqueName: \"kubernetes.io/projected/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-kube-api-access-sdqpb\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.024637 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.024714 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rbwnz\" (UniqueName: \"kubernetes.io/projected/47bdfeda-c97a-40b5-82f8-1008ba20e75b-kube-api-access-rbwnz\") pod \"watcher-operator-controller-manager-6c4d75f7f9-5689f\" (UID: \"47bdfeda-c97a-40b5-82f8-1008ba20e75b\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.024747 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9zgd\" (UniqueName: \"kubernetes.io/projected/a36ba835-deb4-41f5-9b6a-57d1e577c8b1-kube-api-access-q9zgd\") pod \"test-operator-controller-manager-5c5cb9c4d7-cfb9g\" (UID: \"a36ba835-deb4-41f5-9b6a-57d1e577c8b1\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g" Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.024821 4790 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.024906 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:44:30.524882823 +0000 UTC m=+1001.545998784 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "webhook-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.025071 4790 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.025101 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:44:30.525092089 +0000 UTC m=+1001.546208080 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "metrics-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.057296 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbwnz\" (UniqueName: \"kubernetes.io/projected/47bdfeda-c97a-40b5-82f8-1008ba20e75b-kube-api-access-rbwnz\") pod \"watcher-operator-controller-manager-6c4d75f7f9-5689f\" (UID: \"47bdfeda-c97a-40b5-82f8-1008ba20e75b\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.067177 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9zgd\" (UniqueName: \"kubernetes.io/projected/a36ba835-deb4-41f5-9b6a-57d1e577c8b1-kube-api-access-q9zgd\") pod \"test-operator-controller-manager-5c5cb9c4d7-cfb9g\" (UID: \"a36ba835-deb4-41f5-9b6a-57d1e577c8b1\") " 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.090038 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdqpb\" (UniqueName: \"kubernetes.io/projected/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-kube-api-access-sdqpb\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.125964 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j87jp\" (UniqueName: \"kubernetes.io/projected/22e6d110-bd87-4d28-851d-307b4223ee8f-kube-api-access-j87jp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xvrl9\" (UID: \"22e6d110-bd87-4d28-851d-307b4223ee8f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.167648 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.179221 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj"] Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.190663 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx"] Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.228203 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn\" (UID: \"5622f52e-2e94-41ca-a9d2-a0c833895937\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.228319 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j87jp\" (UniqueName: \"kubernetes.io/projected/22e6d110-bd87-4d28-851d-307b4223ee8f-kube-api-access-j87jp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xvrl9\" (UID: \"22e6d110-bd87-4d28-851d-307b4223ee8f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9" Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.228442 4790 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.228486 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert podName:5622f52e-2e94-41ca-a9d2-a0c833895937 nodeName:}" failed. 
No retries permitted until 2026-03-13 20:44:31.228472655 +0000 UTC m=+1002.249588546 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" (UID: "5622f52e-2e94-41ca-a9d2-a0c833895937") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: W0313 20:44:30.238779 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod460b6997_f558_4e5f_9e15_aa33fece4f4b.slice/crio-69c2a352d559833439b415b9bf356ed965314e08515de94a727a7f835b101c20 WatchSource:0}: Error finding container 69c2a352d559833439b415b9bf356ed965314e08515de94a727a7f835b101c20: Status 404 returned error can't find the container with id 69c2a352d559833439b415b9bf356ed965314e08515de94a727a7f835b101c20 Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.249955 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j87jp\" (UniqueName: \"kubernetes.io/projected/22e6d110-bd87-4d28-851d-307b4223ee8f-kube-api-access-j87jp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xvrl9\" (UID: \"22e6d110-bd87-4d28-851d-307b4223ee8f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.341844 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.370144 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.457050 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg"] Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.465980 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s"] Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.471264 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-s8p67"] Mar 13 20:44:30 crc kubenswrapper[4790]: W0313 20:44:30.516549 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5a018c4_3e3a_4f77_a272_20c94a5b9c7a.slice/crio-2c3a00a361a645198b14c86bc083a1da8e0389c6ab64a155018fc0af31c1a4ae WatchSource:0}: Error finding container 2c3a00a361a645198b14c86bc083a1da8e0389c6ab64a155018fc0af31c1a4ae: Status 404 returned error can't find the container with id 2c3a00a361a645198b14c86bc083a1da8e0389c6ab64a155018fc0af31c1a4ae Mar 13 20:44:30 crc kubenswrapper[4790]: W0313 20:44:30.538869 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdbe5269_1150_4269_bc28_1d719f1b77b6.slice/crio-7af45bb6d7ee3106bf724a5b0032db8dccd6431f9e2fee47deb8f7445183d5a5 WatchSource:0}: Error finding container 7af45bb6d7ee3106bf724a5b0032db8dccd6431f9e2fee47deb8f7445183d5a5: Status 404 returned error can't find the container with id 7af45bb6d7ee3106bf724a5b0032db8dccd6431f9e2fee47deb8f7445183d5a5 Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.552832 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.552938 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.553124 4790 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.553189 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:44:31.553174804 +0000 UTC m=+1002.574290695 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "metrics-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.553668 4790 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.553702 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:44:31.553693619 +0000 UTC m=+1002.574809510 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "webhook-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.597478 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9"] Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.623141 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh"] Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.843330 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v"] Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.857661 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96"] Mar 13 20:44:30 crc kubenswrapper[4790]: W0313 
20:44:30.874988 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5befe4e4_4574_42ac_90ce_ac67c1e33eee.slice/crio-fdeb90e491c03789dbf99c55d2d74776a14b8b2833d300849a02386f3ba72971 WatchSource:0}: Error finding container fdeb90e491c03789dbf99c55d2d74776a14b8b2833d300849a02386f3ba72971: Status 404 returned error can't find the container with id fdeb90e491c03789dbf99c55d2d74776a14b8b2833d300849a02386f3ba72971 Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.876996 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh" event={"ID":"dd8df218-c492-4e48-93a9-f5f2dbf7fc00","Type":"ContainerStarted","Data":"c9228b2b9361a604f58db4a2eef1f2e90da985b067f1754db68fef1126f2ae05"} Mar 13 20:44:30 crc kubenswrapper[4790]: W0313 20:44:30.878727 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode154cc44_2769_4bfe_b8ef_3f6c56f08f74.slice/crio-750e36aae99028ac45b5d8402f6a8b525f1d8da757fcd5c61374d3536cda5673 WatchSource:0}: Error finding container 750e36aae99028ac45b5d8402f6a8b525f1d8da757fcd5c61374d3536cda5673: Status 404 returned error can't find the container with id 750e36aae99028ac45b5d8402f6a8b525f1d8da757fcd5c61374d3536cda5673 Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.879253 4790 generic.go:334] "Generic (PLEG): container finished" podID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" containerID="e192663f06dfb187428edb1e170aca9856113025c267275042a42fcf172697f7" exitCode=0 Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.879323 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgrz9" event={"ID":"8883fcbc-75ff-43e3-8088-f2ba848e9d3a","Type":"ContainerDied","Data":"e192663f06dfb187428edb1e170aca9856113025c267275042a42fcf172697f7"} Mar 13 20:44:30 crc kubenswrapper[4790]: 
I0313 20:44:30.880322 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" event={"ID":"77f24ce6-bc52-4831-902c-255983a8f911","Type":"ContainerStarted","Data":"dbd1091664789a5509e9f7034d0926af7220a21a6ea02635b0173a9b600c6c2d"} Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.882248 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx" event={"ID":"460b6997-f558-4e5f-9e15-aa33fece4f4b","Type":"ContainerStarted","Data":"69c2a352d559833439b415b9bf356ed965314e08515de94a727a7f835b101c20"} Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.888367 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" event={"ID":"bdbe5269-1150-4269-bc28-1d719f1b77b6","Type":"ContainerStarted","Data":"7af45bb6d7ee3106bf724a5b0032db8dccd6431f9e2fee47deb8f7445183d5a5"} Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.894845 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s" event={"ID":"b5a018c4-3e3a-4f77-a272-20c94a5b9c7a","Type":"ContainerStarted","Data":"2c3a00a361a645198b14c86bc083a1da8e0389c6ab64a155018fc0af31c1a4ae"} Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.896536 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7" event={"ID":"a7488d00-50bc-4ce8-ae0a-8d3ff807c0da","Type":"ContainerStarted","Data":"88314e17948ded46962315cdf0d51b36233b450d17b62ec8aa726fd2277a480b"} Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.900312 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" 
event={"ID":"46fb44a5-f567-4f58-80b1-dd70694f9339","Type":"ContainerStarted","Data":"b5ba93f1f107573b0d70724ba7c4e4057f7c5ac0a7a5d323a07d9c2217790481"} Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.901696 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj" event={"ID":"2747d064-d45f-4a4e-87c2-d2c9f82eac10","Type":"ContainerStarted","Data":"ea6092c7addb8875e9e417beb7252e64ffc37b00aab65b67c59881bb119fb4a5"} Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.961049 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-jrr7h\" (UID: \"7caf7136-8a46-410b-8a32-72ab19e8baca\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.961225 4790 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.972324 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert podName:7caf7136-8a46-410b-8a32-72ab19e8baca nodeName:}" failed. No retries permitted until 2026-03-13 20:44:32.972280685 +0000 UTC m=+1003.993396666 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert") pod "infra-operator-controller-manager-54dc5b8f8d-jrr7h" (UID: "7caf7136-8a46-410b-8a32-72ab19e8baca") : secret "infra-operator-webhook-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.973651 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj"] Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.007743 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp"] Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.013911 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl"] Mar 13 20:44:31 crc kubenswrapper[4790]: W0313 20:44:31.025605 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod499aa973_6f5e_4229_9282_52c4fbf0625f.slice/crio-73fc5045fc738d9e6c72f4868b4432b0c6069a1457ba333c45b480798f7238e8 WatchSource:0}: Error finding container 73fc5045fc738d9e6c72f4868b4432b0c6069a1457ba333c45b480798f7238e8: Status 404 returned error can't find the container with id 73fc5045fc738d9e6c72f4868b4432b0c6069a1457ba333c45b480798f7238e8 Mar 13 20:44:31 crc kubenswrapper[4790]: W0313 20:44:31.028571 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod403c2990_8871_47da_abd8_8c9fc5753d54.slice/crio-fa0608dd8880d0583e2815e607f5a86d97edf6584168ec85743b47e6235a63d7 WatchSource:0}: Error finding container fa0608dd8880d0583e2815e607f5a86d97edf6584168ec85743b47e6235a63d7: Status 404 returned error can't find the container with id fa0608dd8880d0583e2815e607f5a86d97edf6584168ec85743b47e6235a63d7 Mar 13 20:44:31 crc kubenswrapper[4790]: 
I0313 20:44:31.105133 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8"] Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.113328 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f"] Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.122805 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv"] Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.132543 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz"] Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.137581 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sp9hr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 
8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-hwdv8_openstack-operators(b1273818-139a-4213-b23c-609a7305c92f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.139292 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" podUID="b1273818-139a-4213-b23c-609a7305c92f" Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.144487 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s"] Mar 13 20:44:31 crc kubenswrapper[4790]: W0313 20:44:31.149172 4790 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2032df10_91a5_4a88_9705_c355f50a5024.slice/crio-5e79e6db9e6e6c63b246b4f17677fd322e65da34837527d519d091d2e36b1662 WatchSource:0}: Error finding container 5e79e6db9e6e6c63b246b4f17677fd322e65da34837527d519d091d2e36b1662: Status 404 returned error can't find the container with id 5e79e6db9e6e6c63b246b4f17677fd322e65da34837527d519d091d2e36b1662 Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.151967 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g"] Mar 13 20:44:31 crc kubenswrapper[4790]: W0313 20:44:31.152121 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb36f993b_25cd_4f12_bf48_77bf6f4cf26b.slice/crio-f2801e97683362807e0323a55a8dc52073a52f4ba738c701cf75ed3c5abf5c16 WatchSource:0}: Error finding container f2801e97683362807e0323a55a8dc52073a52f4ba738c701cf75ed3c5abf5c16: Status 404 returned error can't find the container with id f2801e97683362807e0323a55a8dc52073a52f4ba738c701cf75ed3c5abf5c16 Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.154732 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:3a0fc90da4caf7412ae01e21542b53a10fe7a2732a705b0ae83f926d72c7332a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5ql5n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6854b8b9d9-f8l4s_openstack-operators(2032df10-91a5-4a88-9705-c355f50a5024): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.154740 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sslbn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-574d45c66c-c9lbv_openstack-operators(b36f993b-25cd-4f12-bf48-77bf6f4cf26b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.155922 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" podUID="b36f993b-25cd-4f12-bf48-77bf6f4cf26b" Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.155926 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" podUID="2032df10-91a5-4a88-9705-c355f50a5024" Mar 13 20:44:31 crc kubenswrapper[4790]: W0313 20:44:31.172141 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47bdfeda_c97a_40b5_82f8_1008ba20e75b.slice/crio-fe3bc9d5e65108f137424ae3d888a49148b2d3b8e9b886d23169b747c049b061 WatchSource:0}: Error finding 
container fe3bc9d5e65108f137424ae3d888a49148b2d3b8e9b886d23169b747c049b061: Status 404 returned error can't find the container with id fe3bc9d5e65108f137424ae3d888a49148b2d3b8e9b886d23169b747c049b061 Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.175856 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rbwnz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-5689f_openstack-operators(47bdfeda-c97a-40b5-82f8-1008ba20e75b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 20:44:31 crc kubenswrapper[4790]: W0313 20:44:31.176745 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0244e4ae_2ccd_482a_b490_58a8e46ab53d.slice/crio-1494c806ff22ad950dfdb3f564467ae8345bdbaefb552e163d60a947b73c30d8 WatchSource:0}: Error finding container 1494c806ff22ad950dfdb3f564467ae8345bdbaefb552e163d60a947b73c30d8: Status 404 returned error can't find the container with id 1494c806ff22ad950dfdb3f564467ae8345bdbaefb552e163d60a947b73c30d8 Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.178527 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" podUID="47bdfeda-c97a-40b5-82f8-1008ba20e75b" Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.183194 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zkr4g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-7f9cc5dd44-ppzzz_openstack-operators(0244e4ae-2ccd-482a-b490-58a8e46ab53d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.184444 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" podUID="0244e4ae-2ccd-482a-b490-58a8e46ab53d" Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.252448 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9"] Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.266535 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn\" (UID: \"5622f52e-2e94-41ca-a9d2-a0c833895937\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:31 crc kubenswrapper[4790]: 
E0313 20:44:31.267035 4790 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.267157 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert podName:5622f52e-2e94-41ca-a9d2-a0c833895937 nodeName:}" failed. No retries permitted until 2026-03-13 20:44:33.267138833 +0000 UTC m=+1004.288254724 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" (UID: "5622f52e-2e94-41ca-a9d2-a0c833895937") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.569771 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.569827 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.569967 4790 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 20:44:31 crc kubenswrapper[4790]: 
E0313 20:44:31.569995 4790 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.570058 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:44:33.570038921 +0000 UTC m=+1004.591154812 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "metrics-server-cert" not found Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.570076 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:44:33.570067851 +0000 UTC m=+1004.591183742 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "webhook-server-cert" not found Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.937864 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgrz9" event={"ID":"8883fcbc-75ff-43e3-8088-f2ba848e9d3a","Type":"ContainerStarted","Data":"09c550befb39cbb65b9bb00cebc6593f5a02e3c2c1137a099257ae9d32351e24"} Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.940987 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" event={"ID":"0244e4ae-2ccd-482a-b490-58a8e46ab53d","Type":"ContainerStarted","Data":"1494c806ff22ad950dfdb3f564467ae8345bdbaefb552e163d60a947b73c30d8"} Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.943133 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5\\\"\"" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" podUID="0244e4ae-2ccd-482a-b490-58a8e46ab53d" Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.950362 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" event={"ID":"b1273818-139a-4213-b23c-609a7305c92f","Type":"ContainerStarted","Data":"e7020105e4517bfbe1bb13af02b8b25abb6552ac1ae5b914868dd911d9396e64"} Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.952445 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" podUID="b1273818-139a-4213-b23c-609a7305c92f" Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.958177 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" event={"ID":"499aa973-6f5e-4229-9282-52c4fbf0625f","Type":"ContainerStarted","Data":"73fc5045fc738d9e6c72f4868b4432b0c6069a1457ba333c45b480798f7238e8"} Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.961083 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jgrz9" podStartSLOduration=2.17809555 podStartE2EDuration="4.961061969s" podCreationTimestamp="2026-03-13 20:44:27 +0000 UTC" firstStartedPulling="2026-03-13 20:44:28.82460865 +0000 UTC m=+999.845724541" lastFinishedPulling="2026-03-13 20:44:31.607575069 +0000 UTC m=+1002.628690960" observedRunningTime="2026-03-13 20:44:31.95631032 +0000 UTC m=+1002.977426231" watchObservedRunningTime="2026-03-13 20:44:31.961061969 +0000 UTC m=+1002.982177860" Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.978411 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl" event={"ID":"403c2990-8871-47da-abd8-8c9fc5753d54","Type":"ContainerStarted","Data":"fa0608dd8880d0583e2815e607f5a86d97edf6584168ec85743b47e6235a63d7"} Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.979890 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" event={"ID":"b36f993b-25cd-4f12-bf48-77bf6f4cf26b","Type":"ContainerStarted","Data":"f2801e97683362807e0323a55a8dc52073a52f4ba738c701cf75ed3c5abf5c16"} Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.992255 4790 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" podUID="b36f993b-25cd-4f12-bf48-77bf6f4cf26b" Mar 13 20:44:32 crc kubenswrapper[4790]: I0313 20:44:31.996580 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g" event={"ID":"a36ba835-deb4-41f5-9b6a-57d1e577c8b1","Type":"ContainerStarted","Data":"7679f460ba89c0e1f9a653df373d65029dac58b10d4b33dbf370a9bd2ca1a341"} Mar 13 20:44:32 crc kubenswrapper[4790]: I0313 20:44:32.012502 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9" event={"ID":"22e6d110-bd87-4d28-851d-307b4223ee8f","Type":"ContainerStarted","Data":"22032535d0f3b23460ceb619582eb046ee9d8684867e59095b92daf1e309bab0"} Mar 13 20:44:32 crc kubenswrapper[4790]: I0313 20:44:32.014279 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" event={"ID":"2032df10-91a5-4a88-9705-c355f50a5024","Type":"ContainerStarted","Data":"5e79e6db9e6e6c63b246b4f17677fd322e65da34837527d519d091d2e36b1662"} Mar 13 20:44:32 crc kubenswrapper[4790]: E0313 20:44:32.016856 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:3a0fc90da4caf7412ae01e21542b53a10fe7a2732a705b0ae83f926d72c7332a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" podUID="2032df10-91a5-4a88-9705-c355f50a5024" Mar 13 20:44:32 crc kubenswrapper[4790]: I0313 20:44:32.023476 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" event={"ID":"386f7e46-c2e3-4eae-aa82-05075883c889","Type":"ContainerStarted","Data":"8826287611f4404be5662201187022cde01073b80013348065ac5caa09fc4463"} Mar 13 20:44:32 crc kubenswrapper[4790]: I0313 20:44:32.026414 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" event={"ID":"47bdfeda-c97a-40b5-82f8-1008ba20e75b","Type":"ContainerStarted","Data":"fe3bc9d5e65108f137424ae3d888a49148b2d3b8e9b886d23169b747c049b061"} Mar 13 20:44:32 crc kubenswrapper[4790]: E0313 20:44:32.028416 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" podUID="47bdfeda-c97a-40b5-82f8-1008ba20e75b" Mar 13 20:44:32 crc kubenswrapper[4790]: I0313 20:44:32.044855 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" event={"ID":"5befe4e4-4574-42ac-90ce-ac67c1e33eee","Type":"ContainerStarted","Data":"fdeb90e491c03789dbf99c55d2d74776a14b8b2833d300849a02386f3ba72971"} Mar 13 20:44:32 crc kubenswrapper[4790]: I0313 20:44:32.054074 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" event={"ID":"e154cc44-2769-4bfe-b8ef-3f6c56f08f74","Type":"ContainerStarted","Data":"750e36aae99028ac45b5d8402f6a8b525f1d8da757fcd5c61374d3536cda5673"} Mar 13 20:44:32 crc kubenswrapper[4790]: I0313 20:44:32.987558 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-jrr7h\" (UID: \"7caf7136-8a46-410b-8a32-72ab19e8baca\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:32 crc kubenswrapper[4790]: E0313 20:44:32.987767 4790 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 20:44:32 crc kubenswrapper[4790]: E0313 20:44:32.987813 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert podName:7caf7136-8a46-410b-8a32-72ab19e8baca nodeName:}" failed. No retries permitted until 2026-03-13 20:44:36.987798833 +0000 UTC m=+1008.008914724 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert") pod "infra-operator-controller-manager-54dc5b8f8d-jrr7h" (UID: "7caf7136-8a46-410b-8a32-72ab19e8baca") : secret "infra-operator-webhook-server-cert" not found Mar 13 20:44:33 crc kubenswrapper[4790]: E0313 20:44:33.065202 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" podUID="b1273818-139a-4213-b23c-609a7305c92f" Mar 13 20:44:33 crc kubenswrapper[4790]: E0313 20:44:33.065206 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5\\\"\"" 
pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" podUID="0244e4ae-2ccd-482a-b490-58a8e46ab53d" Mar 13 20:44:33 crc kubenswrapper[4790]: E0313 20:44:33.066561 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" podUID="b36f993b-25cd-4f12-bf48-77bf6f4cf26b" Mar 13 20:44:33 crc kubenswrapper[4790]: E0313 20:44:33.066708 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:3a0fc90da4caf7412ae01e21542b53a10fe7a2732a705b0ae83f926d72c7332a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" podUID="2032df10-91a5-4a88-9705-c355f50a5024" Mar 13 20:44:33 crc kubenswrapper[4790]: E0313 20:44:33.066800 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" podUID="47bdfeda-c97a-40b5-82f8-1008ba20e75b" Mar 13 20:44:33 crc kubenswrapper[4790]: E0313 20:44:33.291531 4790 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 20:44:33 crc kubenswrapper[4790]: E0313 20:44:33.291657 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert 
podName:5622f52e-2e94-41ca-a9d2-a0c833895937 nodeName:}" failed. No retries permitted until 2026-03-13 20:44:37.291618426 +0000 UTC m=+1008.312734317 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" (UID: "5622f52e-2e94-41ca-a9d2-a0c833895937") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 20:44:33 crc kubenswrapper[4790]: I0313 20:44:33.291741 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn\" (UID: \"5622f52e-2e94-41ca-a9d2-a0c833895937\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:33 crc kubenswrapper[4790]: I0313 20:44:33.596297 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:33 crc kubenswrapper[4790]: E0313 20:44:33.596486 4790 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 20:44:33 crc kubenswrapper[4790]: E0313 20:44:33.596569 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:44:37.596550478 +0000 UTC m=+1008.617666369 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "webhook-server-cert" not found Mar 13 20:44:33 crc kubenswrapper[4790]: I0313 20:44:33.596904 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:33 crc kubenswrapper[4790]: E0313 20:44:33.597026 4790 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 20:44:33 crc kubenswrapper[4790]: E0313 20:44:33.597062 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:44:37.597052902 +0000 UTC m=+1008.618168793 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "metrics-server-cert" not found Mar 13 20:44:37 crc kubenswrapper[4790]: I0313 20:44:37.051997 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-jrr7h\" (UID: \"7caf7136-8a46-410b-8a32-72ab19e8baca\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:37 crc kubenswrapper[4790]: E0313 20:44:37.052168 4790 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 20:44:37 crc kubenswrapper[4790]: E0313 20:44:37.052438 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert podName:7caf7136-8a46-410b-8a32-72ab19e8baca nodeName:}" failed. No retries permitted until 2026-03-13 20:44:45.052419101 +0000 UTC m=+1016.073534992 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert") pod "infra-operator-controller-manager-54dc5b8f8d-jrr7h" (UID: "7caf7136-8a46-410b-8a32-72ab19e8baca") : secret "infra-operator-webhook-server-cert" not found Mar 13 20:44:37 crc kubenswrapper[4790]: I0313 20:44:37.356679 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn\" (UID: \"5622f52e-2e94-41ca-a9d2-a0c833895937\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:37 crc kubenswrapper[4790]: E0313 20:44:37.356858 4790 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 20:44:37 crc kubenswrapper[4790]: E0313 20:44:37.356935 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert podName:5622f52e-2e94-41ca-a9d2-a0c833895937 nodeName:}" failed. No retries permitted until 2026-03-13 20:44:45.356916752 +0000 UTC m=+1016.378032643 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" (UID: "5622f52e-2e94-41ca-a9d2-a0c833895937") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 20:44:37 crc kubenswrapper[4790]: I0313 20:44:37.614719 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:37 crc kubenswrapper[4790]: I0313 20:44:37.615048 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:37 crc kubenswrapper[4790]: I0313 20:44:37.657751 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:37 crc kubenswrapper[4790]: I0313 20:44:37.660190 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:37 crc kubenswrapper[4790]: I0313 20:44:37.660250 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:37 crc kubenswrapper[4790]: E0313 20:44:37.660323 4790 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 20:44:37 crc 
kubenswrapper[4790]: E0313 20:44:37.660417 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:44:45.660397485 +0000 UTC m=+1016.681513386 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "webhook-server-cert" not found Mar 13 20:44:37 crc kubenswrapper[4790]: E0313 20:44:37.660323 4790 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 20:44:37 crc kubenswrapper[4790]: E0313 20:44:37.660485 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:44:45.660466587 +0000 UTC m=+1016.681582478 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "metrics-server-cert" not found Mar 13 20:44:38 crc kubenswrapper[4790]: I0313 20:44:38.143822 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:38 crc kubenswrapper[4790]: I0313 20:44:38.202713 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jgrz9"] Mar 13 20:44:40 crc kubenswrapper[4790]: I0313 20:44:40.111419 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jgrz9" podUID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" containerName="registry-server" containerID="cri-o://09c550befb39cbb65b9bb00cebc6593f5a02e3c2c1137a099257ae9d32351e24" gracePeriod=2 Mar 13 20:44:41 crc kubenswrapper[4790]: I0313 20:44:41.120710 4790 generic.go:334] "Generic (PLEG): container finished" podID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" containerID="09c550befb39cbb65b9bb00cebc6593f5a02e3c2c1137a099257ae9d32351e24" exitCode=0 Mar 13 20:44:41 crc kubenswrapper[4790]: I0313 20:44:41.120805 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgrz9" event={"ID":"8883fcbc-75ff-43e3-8088-f2ba848e9d3a","Type":"ContainerDied","Data":"09c550befb39cbb65b9bb00cebc6593f5a02e3c2c1137a099257ae9d32351e24"} Mar 13 20:44:44 crc kubenswrapper[4790]: I0313 20:44:44.015249 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:44:44 crc 
kubenswrapper[4790]: I0313 20:44:44.015593 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:44:44 crc kubenswrapper[4790]: I0313 20:44:44.015641 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:44:44 crc kubenswrapper[4790]: I0313 20:44:44.016232 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1c2f579c051539fdc9bad07dcbfb84169db8dd999445ba48e52c550831462bdf"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 20:44:44 crc kubenswrapper[4790]: I0313 20:44:44.016284 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://1c2f579c051539fdc9bad07dcbfb84169db8dd999445ba48e52c550831462bdf" gracePeriod=600 Mar 13 20:44:44 crc kubenswrapper[4790]: E0313 20:44:44.897512 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:a3bc074ddd9a26d3a8609e5dbdfa85a78449ba1c9b5542bff9949219d6760e60" Mar 13 20:44:44 crc kubenswrapper[4790]: E0313 20:44:44.897694 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:a3bc074ddd9a26d3a8609e5dbdfa85a78449ba1c9b5542bff9949219d6760e60,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-krblr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-5964f64c48-tzx96_openstack-operators(e154cc44-2769-4bfe-b8ef-3f6c56f08f74): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:44:44 crc kubenswrapper[4790]: E0313 20:44:44.898862 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" podUID="e154cc44-2769-4bfe-b8ef-3f6c56f08f74" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.076632 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-jrr7h\" (UID: \"7caf7136-8a46-410b-8a32-72ab19e8baca\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.100827 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-jrr7h\" (UID: \"7caf7136-8a46-410b-8a32-72ab19e8baca\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.106511 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.150543 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="1c2f579c051539fdc9bad07dcbfb84169db8dd999445ba48e52c550831462bdf" exitCode=0 Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.150805 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"1c2f579c051539fdc9bad07dcbfb84169db8dd999445ba48e52c550831462bdf"} Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.150879 4790 scope.go:117] "RemoveContainer" containerID="79e02ea9be9e1c9905df96f4d2c3972a24c6d7bee0d427327ce884018a382f4c" Mar 13 20:44:45 crc kubenswrapper[4790]: E0313 20:44:45.152135 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:a3bc074ddd9a26d3a8609e5dbdfa85a78449ba1c9b5542bff9949219d6760e60\\\"\"" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" podUID="e154cc44-2769-4bfe-b8ef-3f6c56f08f74" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.381719 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert\") pod 
\"openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn\" (UID: \"5622f52e-2e94-41ca-a9d2-a0c833895937\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.385500 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn\" (UID: \"5622f52e-2e94-41ca-a9d2-a0c833895937\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.395106 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:45 crc kubenswrapper[4790]: E0313 20:44:45.530433 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721" Mar 13 20:44:45 crc kubenswrapper[4790]: E0313 20:44:45.530602 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: 
{{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7jbvd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-776c5696bf-dxntp_openstack-operators(499aa973-6f5e-4229-9282-52c4fbf0625f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:44:45 crc kubenswrapper[4790]: E0313 20:44:45.538180 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" podUID="499aa973-6f5e-4229-9282-52c4fbf0625f" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.559471 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.685552 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-catalog-content\") pod \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.685759 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-utilities\") pod \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.685842 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf67n\" (UniqueName: \"kubernetes.io/projected/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-kube-api-access-gf67n\") pod \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.686105 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.686148 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:45 crc kubenswrapper[4790]: E0313 20:44:45.686292 4790 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 20:44:45 crc kubenswrapper[4790]: E0313 20:44:45.686352 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:45:01.6863344 +0000 UTC m=+1032.707450291 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "metrics-server-cert" not found Mar 13 20:44:45 crc kubenswrapper[4790]: E0313 20:44:45.686291 4790 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 20:44:45 crc kubenswrapper[4790]: E0313 20:44:45.686453 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:45:01.686435003 +0000 UTC m=+1032.707550894 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "webhook-server-cert" not found Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.687011 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-utilities" (OuterVolumeSpecName: "utilities") pod "8883fcbc-75ff-43e3-8088-f2ba848e9d3a" (UID: "8883fcbc-75ff-43e3-8088-f2ba848e9d3a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.694034 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-kube-api-access-gf67n" (OuterVolumeSpecName: "kube-api-access-gf67n") pod "8883fcbc-75ff-43e3-8088-f2ba848e9d3a" (UID: "8883fcbc-75ff-43e3-8088-f2ba848e9d3a"). InnerVolumeSpecName "kube-api-access-gf67n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.736039 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8883fcbc-75ff-43e3-8088-f2ba848e9d3a" (UID: "8883fcbc-75ff-43e3-8088-f2ba848e9d3a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.787250 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.787286 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf67n\" (UniqueName: \"kubernetes.io/projected/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-kube-api-access-gf67n\") on node \"crc\" DevicePath \"\"" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.787299 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:44:46 crc kubenswrapper[4790]: E0313 20:44:46.137149 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:65d0c97340f72a8b23f8e11f4b3efcc6ad37daad9b88e24d4564383a08fa85f7" Mar 13 20:44:46 crc kubenswrapper[4790]: E0313 20:44:46.137341 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:65d0c97340f72a8b23f8e11f4b3efcc6ad37daad9b88e24d4564383a08fa85f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: 
{{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xxdfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-66d56f6ff4-h7rc9_openstack-operators(46fb44a5-f567-4f58-80b1-dd70694f9339): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:44:46 crc kubenswrapper[4790]: E0313 20:44:46.138570 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" podUID="46fb44a5-f567-4f58-80b1-dd70694f9339" Mar 13 20:44:46 crc kubenswrapper[4790]: I0313 20:44:46.157264 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgrz9" event={"ID":"8883fcbc-75ff-43e3-8088-f2ba848e9d3a","Type":"ContainerDied","Data":"94c84ec1662023adbd79b891587ec02bac606782a1b69fbe98e2395146aadf04"} Mar 13 20:44:46 crc kubenswrapper[4790]: I0313 20:44:46.157287 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:46 crc kubenswrapper[4790]: E0313 20:44:46.158780 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:65d0c97340f72a8b23f8e11f4b3efcc6ad37daad9b88e24d4564383a08fa85f7\\\"\"" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" podUID="46fb44a5-f567-4f58-80b1-dd70694f9339" Mar 13 20:44:46 crc kubenswrapper[4790]: E0313 20:44:46.159597 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" podUID="499aa973-6f5e-4229-9282-52c4fbf0625f" Mar 13 20:44:46 crc kubenswrapper[4790]: I0313 20:44:46.204141 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jgrz9"] Mar 13 20:44:46 crc kubenswrapper[4790]: I0313 20:44:46.209886 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-jgrz9"] Mar 13 20:44:46 crc kubenswrapper[4790]: E0313 20:44:46.720903 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:47dae162826e2e457bdc34f6dfebcf8f7d56e189fdbeba2e0118991a420a4165" Mar 13 20:44:46 crc kubenswrapper[4790]: E0313 20:44:46.721116 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:47dae162826e2e457bdc34f6dfebcf8f7d56e189fdbeba2e0118991a420a4165,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7sznf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-d47688694-s8p67_openstack-operators(bdbe5269-1150-4269-bc28-1d719f1b77b6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:44:46 crc kubenswrapper[4790]: E0313 20:44:46.722521 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" podUID="bdbe5269-1150-4269-bc28-1d719f1b77b6" Mar 13 20:44:47 crc kubenswrapper[4790]: E0313 20:44:47.208217 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:47dae162826e2e457bdc34f6dfebcf8f7d56e189fdbeba2e0118991a420a4165\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" podUID="bdbe5269-1150-4269-bc28-1d719f1b77b6" Mar 13 20:44:47 crc kubenswrapper[4790]: E0313 20:44:47.362466 4790 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:a26d062af19b3bc6dc6633171f1eff8eec33e8e925465d4968a0b9a36012a7e7" Mar 13 20:44:47 crc kubenswrapper[4790]: E0313 20:44:47.362646 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a26d062af19b3bc6dc6633171f1eff8eec33e8e925465d4968a0b9a36012a7e7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wnq54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v_openstack-operators(5befe4e4-4574-42ac-90ce-ac67c1e33eee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:44:47 crc kubenswrapper[4790]: E0313 20:44:47.364008 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" podUID="5befe4e4-4574-42ac-90ce-ac67c1e33eee" Mar 13 20:44:47 crc kubenswrapper[4790]: I0313 20:44:47.673561 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" path="/var/lib/kubelet/pods/8883fcbc-75ff-43e3-8088-f2ba848e9d3a/volumes" Mar 13 20:44:47 crc kubenswrapper[4790]: I0313 20:44:47.807051 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xjs8f"] Mar 13 20:44:47 crc kubenswrapper[4790]: E0313 20:44:47.807804 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" containerName="extract-content" Mar 13 20:44:47 crc kubenswrapper[4790]: I0313 20:44:47.807828 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" containerName="extract-content" Mar 13 20:44:47 crc kubenswrapper[4790]: E0313 20:44:47.807900 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" containerName="registry-server" Mar 13 20:44:47 crc kubenswrapper[4790]: I0313 20:44:47.807912 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" containerName="registry-server" Mar 13 20:44:47 crc kubenswrapper[4790]: E0313 20:44:47.807927 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" containerName="extract-utilities" Mar 13 20:44:47 crc kubenswrapper[4790]: I0313 20:44:47.807936 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" containerName="extract-utilities" Mar 13 20:44:47 crc kubenswrapper[4790]: I0313 20:44:47.808236 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" containerName="registry-server" Mar 13 20:44:47 crc kubenswrapper[4790]: I0313 20:44:47.809455 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:44:47 crc kubenswrapper[4790]: I0313 20:44:47.816514 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xjs8f"] Mar 13 20:44:47 crc kubenswrapper[4790]: I0313 20:44:47.932674 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-catalog-content\") pod \"community-operators-xjs8f\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:44:47 crc kubenswrapper[4790]: I0313 20:44:47.932757 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-utilities\") pod \"community-operators-xjs8f\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:44:47 crc kubenswrapper[4790]: I0313 20:44:47.932855 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2gr2\" (UniqueName: \"kubernetes.io/projected/f4f563dc-7ac2-4d78-96d4-55d27013fec4-kube-api-access-d2gr2\") pod \"community-operators-xjs8f\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:44:48 crc kubenswrapper[4790]: I0313 20:44:48.034260 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-catalog-content\") pod \"community-operators-xjs8f\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:44:48 crc kubenswrapper[4790]: I0313 20:44:48.034344 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-utilities\") pod \"community-operators-xjs8f\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:44:48 crc kubenswrapper[4790]: I0313 20:44:48.034581 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2gr2\" (UniqueName: \"kubernetes.io/projected/f4f563dc-7ac2-4d78-96d4-55d27013fec4-kube-api-access-d2gr2\") pod \"community-operators-xjs8f\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:44:48 crc kubenswrapper[4790]: I0313 20:44:48.034897 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-utilities\") pod \"community-operators-xjs8f\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:44:48 crc kubenswrapper[4790]: I0313 20:44:48.034899 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-catalog-content\") pod \"community-operators-xjs8f\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:44:48 crc kubenswrapper[4790]: I0313 20:44:48.051556 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2gr2\" (UniqueName: \"kubernetes.io/projected/f4f563dc-7ac2-4d78-96d4-55d27013fec4-kube-api-access-d2gr2\") pod \"community-operators-xjs8f\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:44:48 crc kubenswrapper[4790]: I0313 20:44:48.131569 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:44:48 crc kubenswrapper[4790]: E0313 20:44:48.200125 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a26d062af19b3bc6dc6633171f1eff8eec33e8e925465d4968a0b9a36012a7e7\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" podUID="5befe4e4-4574-42ac-90ce-ac67c1e33eee" Mar 13 20:44:48 crc kubenswrapper[4790]: E0313 20:44:48.526250 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca" Mar 13 20:44:48 crc kubenswrapper[4790]: E0313 20:44:48.526425 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sbqlp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-684f77d66d-5vcsg_openstack-operators(77f24ce6-bc52-4831-902c-255983a8f911): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:44:48 crc kubenswrapper[4790]: E0313 20:44:48.527727 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" podUID="77f24ce6-bc52-4831-902c-255983a8f911" Mar 13 20:44:49 crc kubenswrapper[4790]: E0313 20:44:49.206651 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" podUID="77f24ce6-bc52-4831-902c-255983a8f911" Mar 13 20:44:50 crc kubenswrapper[4790]: I0313 20:44:50.952277 4790 scope.go:117] "RemoveContainer" containerID="09c550befb39cbb65b9bb00cebc6593f5a02e3c2c1137a099257ae9d32351e24" Mar 13 20:44:53 crc kubenswrapper[4790]: E0313 20:44:53.774804 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff" Mar 13 20:44:53 crc kubenswrapper[4790]: E0313 20:44:53.775573 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 
0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cpxs5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-7f84474648-b8lpj_openstack-operators(386f7e46-c2e3-4eae-aa82-05075883c889): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:44:53 crc kubenswrapper[4790]: E0313 20:44:53.777256 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" podUID="386f7e46-c2e3-4eae-aa82-05075883c889" Mar 13 20:44:54 crc kubenswrapper[4790]: E0313 20:44:54.250827 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" podUID="386f7e46-c2e3-4eae-aa82-05075883c889" Mar 13 20:44:55 crc kubenswrapper[4790]: E0313 20:44:55.101298 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 13 20:44:55 crc kubenswrapper[4790]: E0313 20:44:55.101492 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m 
DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j87jp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-xvrl9_openstack-operators(22e6d110-bd87-4d28-851d-307b4223ee8f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:44:55 crc kubenswrapper[4790]: E0313 20:44:55.103589 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9" podUID="22e6d110-bd87-4d28-851d-307b4223ee8f" Mar 13 20:44:55 crc kubenswrapper[4790]: I0313 20:44:55.162275 4790 scope.go:117] "RemoveContainer" containerID="e192663f06dfb187428edb1e170aca9856113025c267275042a42fcf172697f7" Mar 13 20:44:55 crc kubenswrapper[4790]: I0313 20:44:55.234916 4790 scope.go:117] "RemoveContainer" containerID="13883d616d7859b8b1f4e3643b2470ceb4a60d0faba96109c31a1ecc31533caa" Mar 13 20:44:55 crc kubenswrapper[4790]: E0313 20:44:55.266025 4790 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9" podUID="22e6d110-bd87-4d28-851d-307b4223ee8f" Mar 13 20:44:55 crc kubenswrapper[4790]: I0313 20:44:55.461147 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn"] Mar 13 20:44:55 crc kubenswrapper[4790]: W0313 20:44:55.466642 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5622f52e_2e94_41ca_a9d2_a0c833895937.slice/crio-cd529476d6037583eee6c696b1d2ac3ea85e978bad8da5c7f5a9e1bea8fefb25 WatchSource:0}: Error finding container cd529476d6037583eee6c696b1d2ac3ea85e978bad8da5c7f5a9e1bea8fefb25: Status 404 returned error can't find the container with id cd529476d6037583eee6c696b1d2ac3ea85e978bad8da5c7f5a9e1bea8fefb25 Mar 13 20:44:55 crc kubenswrapper[4790]: I0313 20:44:55.573740 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h"] Mar 13 20:44:55 crc kubenswrapper[4790]: I0313 20:44:55.704697 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xjs8f"] Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.272320 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"232d637183e61cb15eeba88ed1e9cabcbc6f085073f5f974ddeeeb1a6f8eb83c"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.274605 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" event={"ID":"5622f52e-2e94-41ca-a9d2-a0c833895937","Type":"ContainerStarted","Data":"cd529476d6037583eee6c696b1d2ac3ea85e978bad8da5c7f5a9e1bea8fefb25"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.281487 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s" event={"ID":"b5a018c4-3e3a-4f77-a272-20c94a5b9c7a","Type":"ContainerStarted","Data":"10bdaf4d26e7e5fe23d771ead6638f0b751f8f939d42553bd683fa795643fc52"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.281613 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.299623 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" event={"ID":"b36f993b-25cd-4f12-bf48-77bf6f4cf26b","Type":"ContainerStarted","Data":"8ab77fc960301017ff2965634df34952f9217a4c67bd87ec0a32f41213570074"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.300094 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.337057 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" containerID="fb40f1dd553eeada38a0eebd58e1e4a584d8b04e5e145194a1581e6f7877058c" exitCode=0 Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.337128 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjs8f" event={"ID":"f4f563dc-7ac2-4d78-96d4-55d27013fec4","Type":"ContainerDied","Data":"fb40f1dd553eeada38a0eebd58e1e4a584d8b04e5e145194a1581e6f7877058c"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 
20:44:56.337153 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjs8f" event={"ID":"f4f563dc-7ac2-4d78-96d4-55d27013fec4","Type":"ContainerStarted","Data":"2c5f243728839b68f10d728f37eb16ef3d5e7896648c916636341303be93e6ac"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.344774 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g" event={"ID":"a36ba835-deb4-41f5-9b6a-57d1e577c8b1","Type":"ContainerStarted","Data":"163b0e72972fc5ab793eec150a56f0e470a1cfcd2f192f73d3752fdace53e20a"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.346013 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.366867 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl" event={"ID":"403c2990-8871-47da-abd8-8c9fc5753d54","Type":"ContainerStarted","Data":"2059239a14ccd1b0820091782feb06972de1bdde603a4ed32224a99486a598ac"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.367620 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.383272 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7" event={"ID":"a7488d00-50bc-4ce8-ae0a-8d3ff807c0da","Type":"ContainerStarted","Data":"38862ca9cc80e3d7be5dd86265607af3e9603c6af6bfa840e8077bc0e61a6f76"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.383904 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 
20:44:56.403507 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" event={"ID":"0244e4ae-2ccd-482a-b490-58a8e46ab53d","Type":"ContainerStarted","Data":"1527737f5112aff2b5a9800a20cb1658d3bcc389c6c26a39fc763408541f7ab1"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.404077 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.409108 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh" event={"ID":"dd8df218-c492-4e48-93a9-f5f2dbf7fc00","Type":"ContainerStarted","Data":"46b6eb30288ceb9db61be39bd00d47823550dd3ec2e1610ed5dbb25d29e279ed"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.409834 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.411123 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" event={"ID":"47bdfeda-c97a-40b5-82f8-1008ba20e75b","Type":"ContainerStarted","Data":"b9100d0a09419790d3788fe172d9b1e725ba5b7571b08ce9361b2a7a344df5fd"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.411619 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.412939 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj" event={"ID":"2747d064-d45f-4a4e-87c2-d2c9f82eac10","Type":"ContainerStarted","Data":"4b4149598dab0644f4b7235becb2e775cb2d1423853250057db34f1dbc9605e3"} Mar 13 20:44:56 crc 
kubenswrapper[4790]: I0313 20:44:56.413466 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.414650 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" event={"ID":"2032df10-91a5-4a88-9705-c355f50a5024","Type":"ContainerStarted","Data":"54231c927763897d95b75b93d1914fe37cf1bc7bb82c53c3e52083ee1929070e"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.415108 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.432356 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" podStartSLOduration=3.369006536 podStartE2EDuration="27.432337337s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:31.154504178 +0000 UTC m=+1002.175620069" lastFinishedPulling="2026-03-13 20:44:55.217834979 +0000 UTC m=+1026.238950870" observedRunningTime="2026-03-13 20:44:56.400792621 +0000 UTC m=+1027.421908522" watchObservedRunningTime="2026-03-13 20:44:56.432337337 +0000 UTC m=+1027.453453228" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.449834 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx" event={"ID":"460b6997-f558-4e5f-9e15-aa33fece4f4b","Type":"ContainerStarted","Data":"e0f4fd3d12b0110991e073cf73d1b83026cd9e0942ddaa63adbc72fd664a0ed0"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.450585 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx" Mar 13 20:44:56 crc 
kubenswrapper[4790]: I0313 20:44:56.466030 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" event={"ID":"7caf7136-8a46-410b-8a32-72ab19e8baca","Type":"ContainerStarted","Data":"7fff8705da681634b83077f3798d4c96c7f4199628e55122f1056dbc03d811fc"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.489669 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" event={"ID":"b1273818-139a-4213-b23c-609a7305c92f","Type":"ContainerStarted","Data":"5b405895cbc2593f65b40d1a9e1a3c335cd77a812d3f01402ff4c60e9e14af0c"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.490453 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.528561 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s" podStartSLOduration=7.109120341 podStartE2EDuration="27.528540767s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:30.521870155 +0000 UTC m=+1001.542986046" lastFinishedPulling="2026-03-13 20:44:50.941290581 +0000 UTC m=+1021.962406472" observedRunningTime="2026-03-13 20:44:56.479465185 +0000 UTC m=+1027.500581076" watchObservedRunningTime="2026-03-13 20:44:56.528540767 +0000 UTC m=+1027.549656648" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.537960 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj" podStartSLOduration=6.838166571 podStartE2EDuration="27.537942272s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:30.241562861 +0000 UTC m=+1001.262678752" lastFinishedPulling="2026-03-13 
20:44:50.941338512 +0000 UTC m=+1021.962454453" observedRunningTime="2026-03-13 20:44:56.523955522 +0000 UTC m=+1027.545071413" watchObservedRunningTime="2026-03-13 20:44:56.537942272 +0000 UTC m=+1027.559058163" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.604869 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl" podStartSLOduration=7.71249584 podStartE2EDuration="27.604849267s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:31.038957124 +0000 UTC m=+1002.060073015" lastFinishedPulling="2026-03-13 20:44:50.931310501 +0000 UTC m=+1021.952426442" observedRunningTime="2026-03-13 20:44:56.548592781 +0000 UTC m=+1027.569708672" watchObservedRunningTime="2026-03-13 20:44:56.604849267 +0000 UTC m=+1027.625965148" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.610492 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" podStartSLOduration=3.54642724 podStartE2EDuration="27.61047782s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:31.15457488 +0000 UTC m=+1002.175690781" lastFinishedPulling="2026-03-13 20:44:55.21862547 +0000 UTC m=+1026.239741361" observedRunningTime="2026-03-13 20:44:56.595885194 +0000 UTC m=+1027.617001085" watchObservedRunningTime="2026-03-13 20:44:56.61047782 +0000 UTC m=+1027.631593701" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.624110 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7" podStartSLOduration=6.720485607 podStartE2EDuration="27.624096209s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:30.027627137 +0000 UTC m=+1001.048743028" lastFinishedPulling="2026-03-13 20:44:50.931237739 
+0000 UTC m=+1021.952353630" observedRunningTime="2026-03-13 20:44:56.623712239 +0000 UTC m=+1027.644828120" watchObservedRunningTime="2026-03-13 20:44:56.624096209 +0000 UTC m=+1027.645212100" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.656388 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx" podStartSLOduration=8.212864364 podStartE2EDuration="27.656363094s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:30.257512474 +0000 UTC m=+1001.278628365" lastFinishedPulling="2026-03-13 20:44:49.701011204 +0000 UTC m=+1020.722127095" observedRunningTime="2026-03-13 20:44:56.652750637 +0000 UTC m=+1027.673866528" watchObservedRunningTime="2026-03-13 20:44:56.656363094 +0000 UTC m=+1027.677478975" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.690069 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh" podStartSLOduration=8.408657096 podStartE2EDuration="28.690052478s" podCreationTimestamp="2026-03-13 20:44:28 +0000 UTC" firstStartedPulling="2026-03-13 20:44:30.651147662 +0000 UTC m=+1001.672263553" lastFinishedPulling="2026-03-13 20:44:50.932543044 +0000 UTC m=+1021.953658935" observedRunningTime="2026-03-13 20:44:56.684155488 +0000 UTC m=+1027.705271379" watchObservedRunningTime="2026-03-13 20:44:56.690052478 +0000 UTC m=+1027.711168369" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.726172 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g" podStartSLOduration=7.891147477 podStartE2EDuration="27.726154308s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:31.133410326 +0000 UTC m=+1002.154526217" lastFinishedPulling="2026-03-13 20:44:50.968417157 +0000 UTC 
m=+1021.989533048" observedRunningTime="2026-03-13 20:44:56.717622676 +0000 UTC m=+1027.738738567" watchObservedRunningTime="2026-03-13 20:44:56.726154308 +0000 UTC m=+1027.747270199" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.785664 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" podStartSLOduration=3.750913857 podStartE2EDuration="27.785646202s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:31.183036642 +0000 UTC m=+1002.204152543" lastFinishedPulling="2026-03-13 20:44:55.217768997 +0000 UTC m=+1026.238884888" observedRunningTime="2026-03-13 20:44:56.754329002 +0000 UTC m=+1027.775444893" watchObservedRunningTime="2026-03-13 20:44:56.785646202 +0000 UTC m=+1027.806762093" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.805295 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" podStartSLOduration=3.769389989 podStartE2EDuration="27.805268775s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:31.175682633 +0000 UTC m=+1002.196798524" lastFinishedPulling="2026-03-13 20:44:55.211561419 +0000 UTC m=+1026.232677310" observedRunningTime="2026-03-13 20:44:56.804838852 +0000 UTC m=+1027.825954753" watchObservedRunningTime="2026-03-13 20:44:56.805268775 +0000 UTC m=+1027.826384666" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.834077 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" podStartSLOduration=3.753683743 podStartE2EDuration="27.834060966s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:31.137431185 +0000 UTC m=+1002.158547076" lastFinishedPulling="2026-03-13 20:44:55.217808408 +0000 UTC m=+1026.238924299" 
observedRunningTime="2026-03-13 20:44:56.8320068 +0000 UTC m=+1027.853122691" watchObservedRunningTime="2026-03-13 20:44:56.834060966 +0000 UTC m=+1027.855176857" Mar 13 20:44:57 crc kubenswrapper[4790]: I0313 20:44:57.502528 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjs8f" event={"ID":"f4f563dc-7ac2-4d78-96d4-55d27013fec4","Type":"ContainerStarted","Data":"5fcf1e83068560d955d24f52c0498d142b90a5e9d5b8dc6a7099e4dc67703b1f"} Mar 13 20:44:58 crc kubenswrapper[4790]: I0313 20:44:58.514324 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" containerID="5fcf1e83068560d955d24f52c0498d142b90a5e9d5b8dc6a7099e4dc67703b1f" exitCode=0 Mar 13 20:44:58 crc kubenswrapper[4790]: I0313 20:44:58.514475 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjs8f" event={"ID":"f4f563dc-7ac2-4d78-96d4-55d27013fec4","Type":"ContainerDied","Data":"5fcf1e83068560d955d24f52c0498d142b90a5e9d5b8dc6a7099e4dc67703b1f"} Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.027662 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.152875 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw"] Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.153763 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.158188 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.158974 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.165128 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw"] Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.174637 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.239402 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm7l9\" (UniqueName: \"kubernetes.io/projected/0001db4d-b91a-473e-bfff-794d8663885f-kube-api-access-dm7l9\") pod \"collect-profiles-29557245-5vhkw\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.239455 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0001db4d-b91a-473e-bfff-794d8663885f-secret-volume\") pod \"collect-profiles-29557245-5vhkw\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.239498 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/0001db4d-b91a-473e-bfff-794d8663885f-config-volume\") pod \"collect-profiles-29557245-5vhkw\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.340824 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm7l9\" (UniqueName: \"kubernetes.io/projected/0001db4d-b91a-473e-bfff-794d8663885f-kube-api-access-dm7l9\") pod \"collect-profiles-29557245-5vhkw\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.340878 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0001db4d-b91a-473e-bfff-794d8663885f-secret-volume\") pod \"collect-profiles-29557245-5vhkw\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.340912 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0001db4d-b91a-473e-bfff-794d8663885f-config-volume\") pod \"collect-profiles-29557245-5vhkw\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.341733 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0001db4d-b91a-473e-bfff-794d8663885f-config-volume\") pod \"collect-profiles-29557245-5vhkw\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.352257 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0001db4d-b91a-473e-bfff-794d8663885f-secret-volume\") pod \"collect-profiles-29557245-5vhkw\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.357053 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.360862 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm7l9\" (UniqueName: \"kubernetes.io/projected/0001db4d-b91a-473e-bfff-794d8663885f-kube-api-access-dm7l9\") pod \"collect-profiles-29557245-5vhkw\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.481011 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:01 crc kubenswrapper[4790]: I0313 20:45:01.764889 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:45:01 crc kubenswrapper[4790]: I0313 20:45:01.766003 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:45:01 crc kubenswrapper[4790]: I0313 20:45:01.776597 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:45:01 crc kubenswrapper[4790]: I0313 20:45:01.776949 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:45:01 crc kubenswrapper[4790]: I0313 20:45:01.993713 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-hzb9g" Mar 13 20:45:02 crc kubenswrapper[4790]: I0313 20:45:02.002463 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:45:02 crc kubenswrapper[4790]: I0313 20:45:02.651567 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd"] Mar 13 20:45:02 crc kubenswrapper[4790]: W0313 20:45:02.662212 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf0c2c50_711c_4fbd_8c15_64bf6fc3572b.slice/crio-f4d360b2a9b9ae95cabe0c4be4d4d79a8eec64b200f19f4e158d18d5b681c25b WatchSource:0}: Error finding container f4d360b2a9b9ae95cabe0c4be4d4d79a8eec64b200f19f4e158d18d5b681c25b: Status 404 returned error can't find the container with id f4d360b2a9b9ae95cabe0c4be4d4d79a8eec64b200f19f4e158d18d5b681c25b Mar 13 20:45:02 crc kubenswrapper[4790]: I0313 20:45:02.711192 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw"] Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.565963 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" event={"ID":"5622f52e-2e94-41ca-a9d2-a0c833895937","Type":"ContainerStarted","Data":"b532c8d4ff76521cedacf0b6e724309f963fdac4a36da5ec87fe020e3821c9c6"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.566464 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.567722 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" event={"ID":"5befe4e4-4574-42ac-90ce-ac67c1e33eee","Type":"ContainerStarted","Data":"d150027c0431ef4b5a35c47118d66fb1a18dbe46a37355f6a03f139f63ee375a"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.568425 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.570043 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" event={"ID":"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b","Type":"ContainerStarted","Data":"32f741f9d4ed079d11ada5c34c9ea9b182a33477d0b60355d0c392292d2cdc9f"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.570088 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" event={"ID":"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b","Type":"ContainerStarted","Data":"f4d360b2a9b9ae95cabe0c4be4d4d79a8eec64b200f19f4e158d18d5b681c25b"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.570522 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.573772 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjs8f" event={"ID":"f4f563dc-7ac2-4d78-96d4-55d27013fec4","Type":"ContainerStarted","Data":"364294eac139e93e008805a4dd45727e95be76e08a1a75d2522639679a070268"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.575639 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" 
event={"ID":"bdbe5269-1150-4269-bc28-1d719f1b77b6","Type":"ContainerStarted","Data":"f33338f374d5f656ccba1ad365e80d3c837dbfe86e4c0d15bb451944cd12ab6f"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.576115 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.577416 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" event={"ID":"46fb44a5-f567-4f58-80b1-dd70694f9339","Type":"ContainerStarted","Data":"6da7077e28e4ea6eb90d9b1e96e4e21fed0b4f0ce53b7f1a279b46cf64d7d1d4"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.577749 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.579030 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" event={"ID":"7caf7136-8a46-410b-8a32-72ab19e8baca","Type":"ContainerStarted","Data":"583c54558feaa4a25a7ffdd55dd6d4247e8e29c1a64aeb290f75da651694992d"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.579469 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.580808 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" event={"ID":"0001db4d-b91a-473e-bfff-794d8663885f","Type":"ContainerStarted","Data":"0e3d04fd35f846d0f8577da19c18befcb486539f0a1127e22cf8b9a5e5547ef3"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.580832 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" event={"ID":"0001db4d-b91a-473e-bfff-794d8663885f","Type":"ContainerStarted","Data":"a5074ae610244688560a689dbf0c1ccfe6efc4d574e04194d1a7f6904e949e82"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.582480 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" event={"ID":"e154cc44-2769-4bfe-b8ef-3f6c56f08f74","Type":"ContainerStarted","Data":"8e0185e92d472f4a6d7bda7359a2cb786124f86991ab9827c4dff79bed91a701"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.582869 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.584050 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" event={"ID":"499aa973-6f5e-4229-9282-52c4fbf0625f","Type":"ContainerStarted","Data":"e9b8a3f9f47d71e1b7fcb34ee4fb4e790a0198bf9f7e737d3a875923a1ecab26"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.584454 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.633780 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xjs8f" podStartSLOduration=10.5920972 podStartE2EDuration="16.633756743s" podCreationTimestamp="2026-03-13 20:44:47 +0000 UTC" firstStartedPulling="2026-03-13 20:44:56.340908037 +0000 UTC m=+1027.362023928" lastFinishedPulling="2026-03-13 20:45:02.38256758 +0000 UTC m=+1033.403683471" observedRunningTime="2026-03-13 20:45:03.630845114 +0000 UTC m=+1034.651961005" watchObservedRunningTime="2026-03-13 20:45:03.633756743 +0000 UTC m=+1034.654872634" Mar 13 20:45:03 crc 
kubenswrapper[4790]: I0313 20:45:03.636760 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" podStartSLOduration=27.823493318 podStartE2EDuration="34.636749234s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:55.471992554 +0000 UTC m=+1026.493108445" lastFinishedPulling="2026-03-13 20:45:02.28524847 +0000 UTC m=+1033.306364361" observedRunningTime="2026-03-13 20:45:03.607625945 +0000 UTC m=+1034.628741856" watchObservedRunningTime="2026-03-13 20:45:03.636749234 +0000 UTC m=+1034.657865125" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.646539 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" podStartSLOduration=3.64652531 podStartE2EDuration="3.64652531s" podCreationTimestamp="2026-03-13 20:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:45:03.643514999 +0000 UTC m=+1034.664630910" watchObservedRunningTime="2026-03-13 20:45:03.64652531 +0000 UTC m=+1034.667641201" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.674756 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" podStartSLOduration=4.247212843 podStartE2EDuration="35.674738915s" podCreationTimestamp="2026-03-13 20:44:28 +0000 UTC" firstStartedPulling="2026-03-13 20:44:30.887759192 +0000 UTC m=+1001.908875083" lastFinishedPulling="2026-03-13 20:45:02.315285254 +0000 UTC m=+1033.336401155" observedRunningTime="2026-03-13 20:45:03.67156402 +0000 UTC m=+1034.692679901" watchObservedRunningTime="2026-03-13 20:45:03.674738915 +0000 UTC m=+1034.695854806" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.698399 4790 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" podStartSLOduration=3.94746725 podStartE2EDuration="35.698317395s" podCreationTimestamp="2026-03-13 20:44:28 +0000 UTC" firstStartedPulling="2026-03-13 20:44:30.631003356 +0000 UTC m=+1001.652119247" lastFinishedPulling="2026-03-13 20:45:02.381853501 +0000 UTC m=+1033.402969392" observedRunningTime="2026-03-13 20:45:03.695678653 +0000 UTC m=+1034.716794564" watchObservedRunningTime="2026-03-13 20:45:03.698317395 +0000 UTC m=+1034.719433286" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.742479 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" podStartSLOduration=34.742462103 podStartE2EDuration="34.742462103s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:45:03.738203207 +0000 UTC m=+1034.759319118" watchObservedRunningTime="2026-03-13 20:45:03.742462103 +0000 UTC m=+1034.763577994" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.759541 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" podStartSLOduration=28.077902851 podStartE2EDuration="34.759526415s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:55.603522553 +0000 UTC m=+1026.624638444" lastFinishedPulling="2026-03-13 20:45:02.285146097 +0000 UTC m=+1033.306262008" observedRunningTime="2026-03-13 20:45:03.757159571 +0000 UTC m=+1034.778275462" watchObservedRunningTime="2026-03-13 20:45:03.759526415 +0000 UTC m=+1034.780642306" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.800014 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" podStartSLOduration=3.466220394 podStartE2EDuration="34.799990813s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:31.04804255 +0000 UTC m=+1002.069158441" lastFinishedPulling="2026-03-13 20:45:02.381812969 +0000 UTC m=+1033.402928860" observedRunningTime="2026-03-13 20:45:03.795895872 +0000 UTC m=+1034.817011763" watchObservedRunningTime="2026-03-13 20:45:03.799990813 +0000 UTC m=+1034.821106704" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.819655 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" podStartSLOduration=4.001046414 podStartE2EDuration="35.819636066s" podCreationTimestamp="2026-03-13 20:44:28 +0000 UTC" firstStartedPulling="2026-03-13 20:44:30.562701933 +0000 UTC m=+1001.583817824" lastFinishedPulling="2026-03-13 20:45:02.381291585 +0000 UTC m=+1033.402407476" observedRunningTime="2026-03-13 20:45:03.816306216 +0000 UTC m=+1034.837422117" watchObservedRunningTime="2026-03-13 20:45:03.819636066 +0000 UTC m=+1034.840751957" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.855947 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" podStartSLOduration=3.361725488 podStartE2EDuration="34.855923s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:30.887441773 +0000 UTC m=+1001.908557664" lastFinishedPulling="2026-03-13 20:45:02.381639285 +0000 UTC m=+1033.402755176" observedRunningTime="2026-03-13 20:45:03.850973157 +0000 UTC m=+1034.872089058" watchObservedRunningTime="2026-03-13 20:45:03.855923 +0000 UTC m=+1034.877038891" Mar 13 20:45:04 crc kubenswrapper[4790]: I0313 20:45:04.591764 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="0001db4d-b91a-473e-bfff-794d8663885f" containerID="0e3d04fd35f846d0f8577da19c18befcb486539f0a1127e22cf8b9a5e5547ef3" exitCode=0 Mar 13 20:45:04 crc kubenswrapper[4790]: I0313 20:45:04.591855 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" event={"ID":"0001db4d-b91a-473e-bfff-794d8663885f","Type":"ContainerDied","Data":"0e3d04fd35f846d0f8577da19c18befcb486539f0a1127e22cf8b9a5e5547ef3"} Mar 13 20:45:05 crc kubenswrapper[4790]: I0313 20:45:05.908513 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.042857 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0001db4d-b91a-473e-bfff-794d8663885f-secret-volume\") pod \"0001db4d-b91a-473e-bfff-794d8663885f\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.042929 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm7l9\" (UniqueName: \"kubernetes.io/projected/0001db4d-b91a-473e-bfff-794d8663885f-kube-api-access-dm7l9\") pod \"0001db4d-b91a-473e-bfff-794d8663885f\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.042966 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0001db4d-b91a-473e-bfff-794d8663885f-config-volume\") pod \"0001db4d-b91a-473e-bfff-794d8663885f\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.043581 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0001db4d-b91a-473e-bfff-794d8663885f-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "0001db4d-b91a-473e-bfff-794d8663885f" (UID: "0001db4d-b91a-473e-bfff-794d8663885f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.047522 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0001db4d-b91a-473e-bfff-794d8663885f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0001db4d-b91a-473e-bfff-794d8663885f" (UID: "0001db4d-b91a-473e-bfff-794d8663885f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.047597 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0001db4d-b91a-473e-bfff-794d8663885f-kube-api-access-dm7l9" (OuterVolumeSpecName: "kube-api-access-dm7l9") pod "0001db4d-b91a-473e-bfff-794d8663885f" (UID: "0001db4d-b91a-473e-bfff-794d8663885f"). InnerVolumeSpecName "kube-api-access-dm7l9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.144866 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0001db4d-b91a-473e-bfff-794d8663885f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.144906 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm7l9\" (UniqueName: \"kubernetes.io/projected/0001db4d-b91a-473e-bfff-794d8663885f-kube-api-access-dm7l9\") on node \"crc\" DevicePath \"\"" Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.144917 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0001db4d-b91a-473e-bfff-794d8663885f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.605730 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" event={"ID":"0001db4d-b91a-473e-bfff-794d8663885f","Type":"ContainerDied","Data":"a5074ae610244688560a689dbf0c1ccfe6efc4d574e04194d1a7f6904e949e82"} Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.606022 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5074ae610244688560a689dbf0c1ccfe6efc4d574e04194d1a7f6904e949e82" Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.606071 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:08 crc kubenswrapper[4790]: I0313 20:45:08.132741 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:45:08 crc kubenswrapper[4790]: I0313 20:45:08.133066 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:45:08 crc kubenswrapper[4790]: I0313 20:45:08.184581 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:45:08 crc kubenswrapper[4790]: I0313 20:45:08.652940 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:45:08 crc kubenswrapper[4790]: I0313 20:45:08.709462 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xjs8f"] Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.425367 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.467770 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.505497 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.547098 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.604258 4790 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.605225 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.605625 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.626000 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.680355 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.712706 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.802745 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.869187 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.878422 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.920901 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" Mar 13 20:45:10 crc kubenswrapper[4790]: I0313 20:45:10.630095 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xjs8f" podUID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" containerName="registry-server" containerID="cri-o://364294eac139e93e008805a4dd45727e95be76e08a1a75d2522639679a070268" gracePeriod=2 Mar 13 20:45:12 crc kubenswrapper[4790]: I0313 20:45:12.007961 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:45:12 crc kubenswrapper[4790]: I0313 20:45:12.658084 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" containerID="364294eac139e93e008805a4dd45727e95be76e08a1a75d2522639679a070268" exitCode=0 Mar 13 20:45:12 crc kubenswrapper[4790]: I0313 20:45:12.658152 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjs8f" event={"ID":"f4f563dc-7ac2-4d78-96d4-55d27013fec4","Type":"ContainerDied","Data":"364294eac139e93e008805a4dd45727e95be76e08a1a75d2522639679a070268"} Mar 13 20:45:13 crc kubenswrapper[4790]: I0313 20:45:13.667631 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" event={"ID":"77f24ce6-bc52-4831-902c-255983a8f911","Type":"ContainerStarted","Data":"9ab2278e56369e3252dff8c8bfa590a2710e8d4ea2e820e8c4aa1f19459dc5c3"} Mar 13 20:45:13 crc kubenswrapper[4790]: I0313 20:45:13.667837 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" Mar 13 20:45:13 crc kubenswrapper[4790]: I0313 20:45:13.684575 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" podStartSLOduration=2.9125516 podStartE2EDuration="44.684548058s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:30.600959811 +0000 UTC m=+1001.622075702" lastFinishedPulling="2026-03-13 20:45:12.372956269 +0000 UTC m=+1043.394072160" observedRunningTime="2026-03-13 20:45:13.680453206 +0000 UTC m=+1044.701569097" watchObservedRunningTime="2026-03-13 20:45:13.684548058 +0000 UTC m=+1044.705663949" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.389360 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.459580 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2gr2\" (UniqueName: \"kubernetes.io/projected/f4f563dc-7ac2-4d78-96d4-55d27013fec4-kube-api-access-d2gr2\") pod \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.459640 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-catalog-content\") pod \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.459814 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-utilities\") pod \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.460951 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-utilities" (OuterVolumeSpecName: "utilities") pod "f4f563dc-7ac2-4d78-96d4-55d27013fec4" (UID: "f4f563dc-7ac2-4d78-96d4-55d27013fec4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.465274 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f563dc-7ac2-4d78-96d4-55d27013fec4-kube-api-access-d2gr2" (OuterVolumeSpecName: "kube-api-access-d2gr2") pod "f4f563dc-7ac2-4d78-96d4-55d27013fec4" (UID: "f4f563dc-7ac2-4d78-96d4-55d27013fec4"). InnerVolumeSpecName "kube-api-access-d2gr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.516713 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4f563dc-7ac2-4d78-96d4-55d27013fec4" (UID: "f4f563dc-7ac2-4d78-96d4-55d27013fec4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.561735 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2gr2\" (UniqueName: \"kubernetes.io/projected/f4f563dc-7ac2-4d78-96d4-55d27013fec4-kube-api-access-d2gr2\") on node \"crc\" DevicePath \"\"" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.562109 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.562185 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.674177 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" event={"ID":"386f7e46-c2e3-4eae-aa82-05075883c889","Type":"ContainerStarted","Data":"ff240f9b939f7e4ac057b8763a167915e478dd487a4bedb6647509ae575e5640"} Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.675177 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.676611 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjs8f" event={"ID":"f4f563dc-7ac2-4d78-96d4-55d27013fec4","Type":"ContainerDied","Data":"2c5f243728839b68f10d728f37eb16ef3d5e7896648c916636341303be93e6ac"} Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.676627 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.676653 4790 scope.go:117] "RemoveContainer" containerID="364294eac139e93e008805a4dd45727e95be76e08a1a75d2522639679a070268" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.678388 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9" event={"ID":"22e6d110-bd87-4d28-851d-307b4223ee8f","Type":"ContainerStarted","Data":"d74da75256df8f2ddc2083a14d07763c7a2a17638b38e47beb2c9483cd7377b4"} Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.704576 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" podStartSLOduration=2.321262355 podStartE2EDuration="45.704551455s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:31.001290941 +0000 UTC m=+1002.022406852" lastFinishedPulling="2026-03-13 20:45:14.384580061 +0000 UTC m=+1045.405695952" observedRunningTime="2026-03-13 20:45:14.696676941 +0000 UTC m=+1045.717792832" watchObservedRunningTime="2026-03-13 20:45:14.704551455 +0000 UTC m=+1045.725667346" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.704666 4790 scope.go:117] "RemoveContainer" containerID="5fcf1e83068560d955d24f52c0498d142b90a5e9d5b8dc6a7099e4dc67703b1f" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.723459 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xjs8f"] Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.730612 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xjs8f"] Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.732162 4790 scope.go:117] "RemoveContainer" containerID="fb40f1dd553eeada38a0eebd58e1e4a584d8b04e5e145194a1581e6f7877058c" Mar 13 20:45:14 crc 
kubenswrapper[4790]: I0313 20:45:14.735585 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9" podStartSLOduration=2.6264852579999998 podStartE2EDuration="45.735546218s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:31.272566781 +0000 UTC m=+1002.293682672" lastFinishedPulling="2026-03-13 20:45:14.381627741 +0000 UTC m=+1045.402743632" observedRunningTime="2026-03-13 20:45:14.73011631 +0000 UTC m=+1045.751232201" watchObservedRunningTime="2026-03-13 20:45:14.735546218 +0000 UTC m=+1045.756662109" Mar 13 20:45:15 crc kubenswrapper[4790]: I0313 20:45:15.112817 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:45:15 crc kubenswrapper[4790]: I0313 20:45:15.402250 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:45:15 crc kubenswrapper[4790]: I0313 20:45:15.667815 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" path="/var/lib/kubelet/pods/f4f563dc-7ac2-4d78-96d4-55d27013fec4/volumes" Mar 13 20:45:19 crc kubenswrapper[4790]: I0313 20:45:19.529995 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" Mar 13 20:45:19 crc kubenswrapper[4790]: I0313 20:45:19.753833 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.675931 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-shfrx"] Mar 13 20:45:41 crc kubenswrapper[4790]: E0313 20:45:41.679352 4790 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0001db4d-b91a-473e-bfff-794d8663885f" containerName="collect-profiles" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.679571 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0001db4d-b91a-473e-bfff-794d8663885f" containerName="collect-profiles" Mar 13 20:45:41 crc kubenswrapper[4790]: E0313 20:45:41.679656 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" containerName="extract-utilities" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.679739 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" containerName="extract-utilities" Mar 13 20:45:41 crc kubenswrapper[4790]: E0313 20:45:41.679827 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" containerName="extract-content" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.679900 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" containerName="extract-content" Mar 13 20:45:41 crc kubenswrapper[4790]: E0313 20:45:41.680007 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" containerName="registry-server" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.680081 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" containerName="registry-server" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.680314 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" containerName="registry-server" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.680444 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="0001db4d-b91a-473e-bfff-794d8663885f" containerName="collect-profiles" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.681428 
4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.688862 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-5lwx2" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.689830 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.690009 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.690173 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.711725 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-shfrx"] Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.744005 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5s4r8"] Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.745458 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.750672 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.756913 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5s4r8"] Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.779270 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-config\") pod \"dnsmasq-dns-675f4bcbfc-shfrx\" (UID: \"61d662b4-cdc6-4d2f-a8a6-f71db4380caa\") " pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.779334 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb6c2\" (UniqueName: \"kubernetes.io/projected/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-kube-api-access-bb6c2\") pod \"dnsmasq-dns-675f4bcbfc-shfrx\" (UID: \"61d662b4-cdc6-4d2f-a8a6-f71db4380caa\") " pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.880640 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-config\") pod \"dnsmasq-dns-675f4bcbfc-shfrx\" (UID: \"61d662b4-cdc6-4d2f-a8a6-f71db4380caa\") " pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.880712 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb6c2\" (UniqueName: \"kubernetes.io/projected/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-kube-api-access-bb6c2\") pod \"dnsmasq-dns-675f4bcbfc-shfrx\" (UID: \"61d662b4-cdc6-4d2f-a8a6-f71db4380caa\") " pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" Mar 13 20:45:41 crc 
kubenswrapper[4790]: I0313 20:45:41.880969 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-5s4r8\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.881111 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-config\") pod \"dnsmasq-dns-78dd6ddcc-5s4r8\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.881201 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d2bb\" (UniqueName: \"kubernetes.io/projected/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-kube-api-access-2d2bb\") pod \"dnsmasq-dns-78dd6ddcc-5s4r8\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.882096 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-config\") pod \"dnsmasq-dns-675f4bcbfc-shfrx\" (UID: \"61d662b4-cdc6-4d2f-a8a6-f71db4380caa\") " pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.899073 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb6c2\" (UniqueName: \"kubernetes.io/projected/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-kube-api-access-bb6c2\") pod \"dnsmasq-dns-675f4bcbfc-shfrx\" (UID: \"61d662b4-cdc6-4d2f-a8a6-f71db4380caa\") " pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 
20:45:41.982303 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-5s4r8\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.982363 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-config\") pod \"dnsmasq-dns-78dd6ddcc-5s4r8\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.982420 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d2bb\" (UniqueName: \"kubernetes.io/projected/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-kube-api-access-2d2bb\") pod \"dnsmasq-dns-78dd6ddcc-5s4r8\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.983872 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-5s4r8\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.984139 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-config\") pod \"dnsmasq-dns-78dd6ddcc-5s4r8\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.998344 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d2bb\" 
(UniqueName: \"kubernetes.io/projected/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-kube-api-access-2d2bb\") pod \"dnsmasq-dns-78dd6ddcc-5s4r8\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:42 crc kubenswrapper[4790]: I0313 20:45:42.006702 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" Mar 13 20:45:42 crc kubenswrapper[4790]: I0313 20:45:42.066065 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:42 crc kubenswrapper[4790]: I0313 20:45:42.421829 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-shfrx"] Mar 13 20:45:42 crc kubenswrapper[4790]: I0313 20:45:42.521780 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5s4r8"] Mar 13 20:45:42 crc kubenswrapper[4790]: I0313 20:45:42.938350 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" event={"ID":"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf","Type":"ContainerStarted","Data":"1e13e0bcda642d94f6b249dc823d2fd87698f812917e6d7b60f2ffc56fbe460d"} Mar 13 20:45:42 crc kubenswrapper[4790]: I0313 20:45:42.939763 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" event={"ID":"61d662b4-cdc6-4d2f-a8a6-f71db4380caa","Type":"ContainerStarted","Data":"efbaf88a68a9782b2a6db13fe0d06640d640fb6b93b4efe14c9fed10e5292a92"} Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.598815 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-shfrx"] Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.611962 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-c6rxs"] Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.613297 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.631271 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-c6rxs"] Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.738182 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zkr4\" (UniqueName: \"kubernetes.io/projected/3603867e-b715-48af-b4d3-248f69035bf4-kube-api-access-5zkr4\") pod \"dnsmasq-dns-5ccc8479f9-c6rxs\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.738236 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-c6rxs\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.738278 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-config\") pod \"dnsmasq-dns-5ccc8479f9-c6rxs\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.843968 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zkr4\" (UniqueName: \"kubernetes.io/projected/3603867e-b715-48af-b4d3-248f69035bf4-kube-api-access-5zkr4\") pod \"dnsmasq-dns-5ccc8479f9-c6rxs\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.844024 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-c6rxs\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.844076 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-config\") pod \"dnsmasq-dns-5ccc8479f9-c6rxs\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.845156 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-config\") pod \"dnsmasq-dns-5ccc8479f9-c6rxs\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.845199 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-c6rxs\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.870575 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zkr4\" (UniqueName: \"kubernetes.io/projected/3603867e-b715-48af-b4d3-248f69035bf4-kube-api-access-5zkr4\") pod \"dnsmasq-dns-5ccc8479f9-c6rxs\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.882712 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5s4r8"] Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.902244 4790 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rkgwq"] Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.903491 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.919940 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rkgwq"] Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.945677 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.046791 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-config\") pod \"dnsmasq-dns-57d769cc4f-rkgwq\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.046851 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rkgwq\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.046915 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8ck7\" (UniqueName: \"kubernetes.io/projected/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-kube-api-access-v8ck7\") pod \"dnsmasq-dns-57d769cc4f-rkgwq\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.149070 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-config\") pod \"dnsmasq-dns-57d769cc4f-rkgwq\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.149355 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rkgwq\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.149422 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8ck7\" (UniqueName: \"kubernetes.io/projected/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-kube-api-access-v8ck7\") pod \"dnsmasq-dns-57d769cc4f-rkgwq\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.150584 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-config\") pod \"dnsmasq-dns-57d769cc4f-rkgwq\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.151124 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rkgwq\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.183409 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8ck7\" (UniqueName: \"kubernetes.io/projected/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-kube-api-access-v8ck7\") pod 
\"dnsmasq-dns-57d769cc4f-rkgwq\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.229259 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.461714 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-c6rxs"] Mar 13 20:45:45 crc kubenswrapper[4790]: W0313 20:45:45.465263 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3603867e_b715_48af_b4d3_248f69035bf4.slice/crio-d0ca1d2819be476a413f05b3b41099985b59f5c5964c2c706ee5cb54a75ac1d5 WatchSource:0}: Error finding container d0ca1d2819be476a413f05b3b41099985b59f5c5964c2c706ee5cb54a75ac1d5: Status 404 returned error can't find the container with id d0ca1d2819be476a413f05b3b41099985b59f5c5964c2c706ee5cb54a75ac1d5 Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.635963 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rkgwq"] Mar 13 20:45:45 crc kubenswrapper[4790]: W0313 20:45:45.639280 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb63dd900_9f63_4b6a_b620_bd1dfaa88cfe.slice/crio-137f1a12183cad427eac35ff712ec2bc7e38f51287a23e598f66e4d7a4466844 WatchSource:0}: Error finding container 137f1a12183cad427eac35ff712ec2bc7e38f51287a23e598f66e4d7a4466844: Status 404 returned error can't find the container with id 137f1a12183cad427eac35ff712ec2bc7e38f51287a23e598f66e4d7a4466844 Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.777847 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.779816 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.781908 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.781917 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.783134 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.783288 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.784272 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6fg95" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.786018 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.786132 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.786262 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.964617 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.964664 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-skg8b\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-kube-api-access-skg8b\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.964699 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.964725 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.964744 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c575f482-56cd-4dfc-84c6-c6bb922d56a9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.964770 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.965242 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.965267 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.965298 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c575f482-56cd-4dfc-84c6-c6bb922d56a9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.965531 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.965562 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.982901 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" 
event={"ID":"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe","Type":"ContainerStarted","Data":"137f1a12183cad427eac35ff712ec2bc7e38f51287a23e598f66e4d7a4466844"} Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.985041 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" event={"ID":"3603867e-b715-48af-b4d3-248f69035bf4","Type":"ContainerStarted","Data":"d0ca1d2819be476a413f05b3b41099985b59f5c5964c2c706ee5cb54a75ac1d5"} Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.051651 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.054298 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.057217 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.057344 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-bssvd" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.057421 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.057574 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.057968 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.072538 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.073071 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 13 20:45:46 crc 
kubenswrapper[4790]: I0313 20:45:46.077320 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.077456 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c575f482-56cd-4dfc-84c6-c6bb922d56a9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.077528 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.077574 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.077616 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.077643 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c575f482-56cd-4dfc-84c6-c6bb922d56a9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.077697 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.077744 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.077929 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.077967 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skg8b\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-kube-api-access-skg8b\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.078045 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.083662 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.083740 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.084744 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.086043 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.087510 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.087844 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.088238 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.088313 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.093121 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c575f482-56cd-4dfc-84c6-c6bb922d56a9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.093717 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.106631 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c575f482-56cd-4dfc-84c6-c6bb922d56a9-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.117021 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skg8b\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-kube-api-access-skg8b\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.162829 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.185491 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-config-data\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.185605 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.185668 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e50b80fb-2251-49e7-a285-1276dbaa3237-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc 
kubenswrapper[4790]: I0313 20:45:46.185689 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.185709 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.185730 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e50b80fb-2251-49e7-a285-1276dbaa3237-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.185752 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zf2q\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-kube-api-access-2zf2q\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.185773 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.185803 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.185843 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.185867 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.288785 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.288877 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.288904 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.288948 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-config-data\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.289000 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.289033 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e50b80fb-2251-49e7-a285-1276dbaa3237-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.289058 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.289095 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " 
pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.289120 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e50b80fb-2251-49e7-a285-1276dbaa3237-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.289146 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zf2q\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-kube-api-access-2zf2q\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.289171 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.289992 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.290787 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-config-data\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.290890 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.291035 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.291097 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.291127 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.295140 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e50b80fb-2251-49e7-a285-1276dbaa3237-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.295679 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: 
\"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.295792 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.298186 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e50b80fb-2251-49e7-a285-1276dbaa3237-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.307414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zf2q\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-kube-api-access-2zf2q\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.312517 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.406621 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.518731 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.256993 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.259068 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.261640 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.262033 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.263090 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-lhxlc" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.263947 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.268464 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.273072 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.409788 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fceb0829-5f0e-4e78-a803-61afc5aa4d60-config-data-default\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.409845 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/fceb0829-5f0e-4e78-a803-61afc5aa4d60-operator-scripts\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.409873 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fceb0829-5f0e-4e78-a803-61afc5aa4d60-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.409915 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fceb0829-5f0e-4e78-a803-61afc5aa4d60-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.410098 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.410184 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fceb0829-5f0e-4e78-a803-61afc5aa4d60-config-data-generated\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.410245 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/fceb0829-5f0e-4e78-a803-61afc5aa4d60-kolla-config\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.410350 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qp6k\" (UniqueName: \"kubernetes.io/projected/fceb0829-5f0e-4e78-a803-61afc5aa4d60-kube-api-access-4qp6k\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.511554 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fceb0829-5f0e-4e78-a803-61afc5aa4d60-config-data-default\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.511603 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fceb0829-5f0e-4e78-a803-61afc5aa4d60-operator-scripts\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.511632 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fceb0829-5f0e-4e78-a803-61afc5aa4d60-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.511655 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fceb0829-5f0e-4e78-a803-61afc5aa4d60-galera-tls-certs\") pod 
\"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.511680 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.511702 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fceb0829-5f0e-4e78-a803-61afc5aa4d60-config-data-generated\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.511755 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fceb0829-5f0e-4e78-a803-61afc5aa4d60-kolla-config\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.512000 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.512340 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fceb0829-5f0e-4e78-a803-61afc5aa4d60-config-data-generated\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc 
kubenswrapper[4790]: I0313 20:45:47.512933 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fceb0829-5f0e-4e78-a803-61afc5aa4d60-config-data-default\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.513000 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qp6k\" (UniqueName: \"kubernetes.io/projected/fceb0829-5f0e-4e78-a803-61afc5aa4d60-kube-api-access-4qp6k\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.513025 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fceb0829-5f0e-4e78-a803-61afc5aa4d60-kolla-config\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.513272 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fceb0829-5f0e-4e78-a803-61afc5aa4d60-operator-scripts\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.517497 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fceb0829-5f0e-4e78-a803-61afc5aa4d60-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.517554 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fceb0829-5f0e-4e78-a803-61afc5aa4d60-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.537478 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qp6k\" (UniqueName: \"kubernetes.io/projected/fceb0829-5f0e-4e78-a803-61afc5aa4d60-kube-api-access-4qp6k\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.541686 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.579674 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.604486 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.606131 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.609132 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.609571 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.609682 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.610081 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-dqbxb" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.612796 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.728670 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa2face0-9349-4482-880a-b23cf41099b2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.728969 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.729048 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa2face0-9349-4482-880a-b23cf41099b2-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.729086 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fa2face0-9349-4482-880a-b23cf41099b2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.729119 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fa2face0-9349-4482-880a-b23cf41099b2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.729297 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fa2face0-9349-4482-880a-b23cf41099b2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.729333 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-466bt\" (UniqueName: \"kubernetes.io/projected/fa2face0-9349-4482-880a-b23cf41099b2-kube-api-access-466bt\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.729418 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fa2face0-9349-4482-880a-b23cf41099b2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.830618 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa2face0-9349-4482-880a-b23cf41099b2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.830674 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa2face0-9349-4482-880a-b23cf41099b2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.830694 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.830727 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa2face0-9349-4482-880a-b23cf41099b2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.830753 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fa2face0-9349-4482-880a-b23cf41099b2-config-data-generated\") pod 
\"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.830771 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fa2face0-9349-4482-880a-b23cf41099b2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.830810 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fa2face0-9349-4482-880a-b23cf41099b2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.830829 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-466bt\" (UniqueName: \"kubernetes.io/projected/fa2face0-9349-4482-880a-b23cf41099b2-kube-api-access-466bt\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.831779 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.832397 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fa2face0-9349-4482-880a-b23cf41099b2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: 
\"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.832697 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fa2face0-9349-4482-880a-b23cf41099b2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.832705 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fa2face0-9349-4482-880a-b23cf41099b2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.833316 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa2face0-9349-4482-880a-b23cf41099b2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.849155 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa2face0-9349-4482-880a-b23cf41099b2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.849762 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa2face0-9349-4482-880a-b23cf41099b2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 
crc kubenswrapper[4790]: I0313 20:45:48.851812 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-466bt\" (UniqueName: \"kubernetes.io/projected/fa2face0-9349-4482-880a-b23cf41099b2-kube-api-access-466bt\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.852518 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.876183 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.877467 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.879472 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.879698 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-jhfzc" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.879828 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.889960 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.938288 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.939880 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94gv9\" (UniqueName: \"kubernetes.io/projected/3980f8da-ddaa-4634-8c09-1a71ae19c58f-kube-api-access-94gv9\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.940028 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3980f8da-ddaa-4634-8c09-1a71ae19c58f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.940289 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3980f8da-ddaa-4634-8c09-1a71ae19c58f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.940454 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3980f8da-ddaa-4634-8c09-1a71ae19c58f-kolla-config\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.940520 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3980f8da-ddaa-4634-8c09-1a71ae19c58f-config-data\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:49 crc kubenswrapper[4790]: I0313 
20:45:49.042256 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3980f8da-ddaa-4634-8c09-1a71ae19c58f-kolla-config\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:49 crc kubenswrapper[4790]: I0313 20:45:49.042311 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3980f8da-ddaa-4634-8c09-1a71ae19c58f-config-data\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:49 crc kubenswrapper[4790]: I0313 20:45:49.042346 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94gv9\" (UniqueName: \"kubernetes.io/projected/3980f8da-ddaa-4634-8c09-1a71ae19c58f-kube-api-access-94gv9\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:49 crc kubenswrapper[4790]: I0313 20:45:49.042399 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3980f8da-ddaa-4634-8c09-1a71ae19c58f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:49 crc kubenswrapper[4790]: I0313 20:45:49.042430 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3980f8da-ddaa-4634-8c09-1a71ae19c58f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:49 crc kubenswrapper[4790]: I0313 20:45:49.043122 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3980f8da-ddaa-4634-8c09-1a71ae19c58f-kolla-config\") pod 
\"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:49 crc kubenswrapper[4790]: I0313 20:45:49.043464 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3980f8da-ddaa-4634-8c09-1a71ae19c58f-config-data\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:49 crc kubenswrapper[4790]: I0313 20:45:49.045601 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3980f8da-ddaa-4634-8c09-1a71ae19c58f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:49 crc kubenswrapper[4790]: I0313 20:45:49.049852 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3980f8da-ddaa-4634-8c09-1a71ae19c58f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:49 crc kubenswrapper[4790]: I0313 20:45:49.064059 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94gv9\" (UniqueName: \"kubernetes.io/projected/3980f8da-ddaa-4634-8c09-1a71ae19c58f-kube-api-access-94gv9\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:49 crc kubenswrapper[4790]: I0313 20:45:49.246698 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 13 20:45:51 crc kubenswrapper[4790]: I0313 20:45:51.076974 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 20:45:51 crc kubenswrapper[4790]: I0313 20:45:51.078546 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 20:45:51 crc kubenswrapper[4790]: I0313 20:45:51.080987 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-hthr2" Mar 13 20:45:51 crc kubenswrapper[4790]: I0313 20:45:51.091117 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 20:45:51 crc kubenswrapper[4790]: I0313 20:45:51.171351 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cc5r\" (UniqueName: \"kubernetes.io/projected/b4696d4e-6124-4bcc-b257-651108f6b837-kube-api-access-6cc5r\") pod \"kube-state-metrics-0\" (UID: \"b4696d4e-6124-4bcc-b257-651108f6b837\") " pod="openstack/kube-state-metrics-0" Mar 13 20:45:51 crc kubenswrapper[4790]: I0313 20:45:51.272213 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cc5r\" (UniqueName: \"kubernetes.io/projected/b4696d4e-6124-4bcc-b257-651108f6b837-kube-api-access-6cc5r\") pod \"kube-state-metrics-0\" (UID: \"b4696d4e-6124-4bcc-b257-651108f6b837\") " pod="openstack/kube-state-metrics-0" Mar 13 20:45:51 crc kubenswrapper[4790]: I0313 20:45:51.299329 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cc5r\" (UniqueName: \"kubernetes.io/projected/b4696d4e-6124-4bcc-b257-651108f6b837-kube-api-access-6cc5r\") pod \"kube-state-metrics-0\" (UID: \"b4696d4e-6124-4bcc-b257-651108f6b837\") " pod="openstack/kube-state-metrics-0" Mar 13 20:45:51 crc kubenswrapper[4790]: I0313 20:45:51.407444 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.241412 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vspq5"] Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.242762 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.248444 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.248500 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-2xbsh" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.248685 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.262090 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vspq5"] Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.320714 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-k7bzr"] Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.322758 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.329388 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-k7bzr"] Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.334554 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c72ac557-7882-4120-b64a-4343639cc766-var-run-ovn\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.334593 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c72ac557-7882-4120-b64a-4343639cc766-scripts\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.334618 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c72ac557-7882-4120-b64a-4343639cc766-combined-ca-bundle\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.334639 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c72ac557-7882-4120-b64a-4343639cc766-var-run\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.334704 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5gzg\" (UniqueName: 
\"kubernetes.io/projected/c72ac557-7882-4120-b64a-4343639cc766-kube-api-access-s5gzg\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.334743 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c72ac557-7882-4120-b64a-4343639cc766-ovn-controller-tls-certs\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.334761 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c72ac557-7882-4120-b64a-4343639cc766-var-log-ovn\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.438262 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c72ac557-7882-4120-b64a-4343639cc766-var-run\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.438330 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-var-lib\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.438406 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-var-log\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.438447 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlmbq\" (UniqueName: \"kubernetes.io/projected/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-kube-api-access-nlmbq\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.438477 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5gzg\" (UniqueName: \"kubernetes.io/projected/c72ac557-7882-4120-b64a-4343639cc766-kube-api-access-s5gzg\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.438515 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-scripts\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.438538 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-etc-ovs\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.438564 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c72ac557-7882-4120-b64a-4343639cc766-ovn-controller-tls-certs\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.438588 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c72ac557-7882-4120-b64a-4343639cc766-var-log-ovn\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.438614 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c72ac557-7882-4120-b64a-4343639cc766-var-run-ovn\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.438645 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c72ac557-7882-4120-b64a-4343639cc766-scripts\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.439183 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c72ac557-7882-4120-b64a-4343639cc766-var-run\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.439943 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-var-run\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " 
pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.439977 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c72ac557-7882-4120-b64a-4343639cc766-combined-ca-bundle\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.440609 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c72ac557-7882-4120-b64a-4343639cc766-var-run-ovn\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.440794 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c72ac557-7882-4120-b64a-4343639cc766-var-log-ovn\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.442982 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c72ac557-7882-4120-b64a-4343639cc766-scripts\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.450135 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c72ac557-7882-4120-b64a-4343639cc766-ovn-controller-tls-certs\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.450749 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c72ac557-7882-4120-b64a-4343639cc766-combined-ca-bundle\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.459475 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5gzg\" (UniqueName: \"kubernetes.io/projected/c72ac557-7882-4120-b64a-4343639cc766-kube-api-access-s5gzg\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.523867 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.524971 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.532445 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.532801 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.533399 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.533620 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.533764 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-wqthz" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.541218 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-scripts\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.541271 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-etc-ovs\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.541966 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-var-run\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.542055 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-var-lib\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.543564 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-var-log\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.543659 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlmbq\" (UniqueName: \"kubernetes.io/projected/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-kube-api-access-nlmbq\") pod \"ovn-controller-ovs-k7bzr\" (UID: 
\"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.544447 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-etc-ovs\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.544502 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-var-run\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.544602 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-var-lib\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.544683 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-var-log\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.555759 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.559917 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-scripts\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " 
pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.562323 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlmbq\" (UniqueName: \"kubernetes.io/projected/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-kube-api-access-nlmbq\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.562713 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.637046 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.648758 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.648945 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgdxh\" (UniqueName: \"kubernetes.io/projected/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-kube-api-access-rgdxh\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.649243 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.649349 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.649426 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.649522 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-config\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.649598 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.649716 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.751623 4790 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.751671 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.751746 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgdxh\" (UniqueName: \"kubernetes.io/projected/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-kube-api-access-rgdxh\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.751781 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.751808 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.751827 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.751853 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-config\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.751881 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.756728 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.756986 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.758370 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.760080 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.764063 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.767918 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-config\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.772892 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.777108 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgdxh\" (UniqueName: \"kubernetes.io/projected/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-kube-api-access-rgdxh\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.785213 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: 
\"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.860937 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:57 crc kubenswrapper[4790]: I0313 20:45:57.678823 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.674652 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.710463 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.710667 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.714342 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.715112 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.715544 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-tcpfk" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.716120 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.716730 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.834391 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ba4867dc-70fb-4533-a075-31fc03f7ef33-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.834444 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba4867dc-70fb-4533-a075-31fc03f7ef33-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.834595 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba4867dc-70fb-4533-a075-31fc03f7ef33-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.834619 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba4867dc-70fb-4533-a075-31fc03f7ef33-config\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.834644 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4867dc-70fb-4533-a075-31fc03f7ef33-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.834746 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" 
(UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.834819 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ba4867dc-70fb-4533-a075-31fc03f7ef33-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.834841 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvg4t\" (UniqueName: \"kubernetes.io/projected/ba4867dc-70fb-4533-a075-31fc03f7ef33-kube-api-access-lvg4t\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.851992 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.861050 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 20:45:58 crc kubenswrapper[4790]: W0313 20:45:58.872957 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfceb0829_5f0e_4e78_a803_61afc5aa4d60.slice/crio-46cad1b617a70f8bcf113971d68047eede9646e0f3e9d5ff131aa856a2f3f0f9 WatchSource:0}: Error finding container 46cad1b617a70f8bcf113971d68047eede9646e0f3e9d5ff131aa856a2f3f0f9: Status 404 returned error can't find the container with id 46cad1b617a70f8bcf113971d68047eede9646e0f3e9d5ff131aa856a2f3f0f9 Mar 13 20:45:58 crc kubenswrapper[4790]: W0313 20:45:58.902505 4790 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc575f482_56cd_4dfc_84c6_c6bb922d56a9.slice/crio-dd3eb9a0e5bdb0287eed7cfa261bf8f63d9daa5df053b0925d31bc794e7ad761 WatchSource:0}: Error finding container dd3eb9a0e5bdb0287eed7cfa261bf8f63d9daa5df053b0925d31bc794e7ad761: Status 404 returned error can't find the container with id dd3eb9a0e5bdb0287eed7cfa261bf8f63d9daa5df053b0925d31bc794e7ad761 Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.936787 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.937146 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ba4867dc-70fb-4533-a075-31fc03f7ef33-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.937182 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvg4t\" (UniqueName: \"kubernetes.io/projected/ba4867dc-70fb-4533-a075-31fc03f7ef33-kube-api-access-lvg4t\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.937208 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba4867dc-70fb-4533-a075-31fc03f7ef33-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.937489 4790 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba4867dc-70fb-4533-a075-31fc03f7ef33-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.937552 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba4867dc-70fb-4533-a075-31fc03f7ef33-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.937574 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba4867dc-70fb-4533-a075-31fc03f7ef33-config\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.937602 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4867dc-70fb-4533-a075-31fc03f7ef33-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.938092 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ba4867dc-70fb-4533-a075-31fc03f7ef33-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.938455 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") device mount path 
\"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.939476 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba4867dc-70fb-4533-a075-31fc03f7ef33-config\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.939568 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba4867dc-70fb-4533-a075-31fc03f7ef33-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.943423 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba4867dc-70fb-4533-a075-31fc03f7ef33-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.947165 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba4867dc-70fb-4533-a075-31fc03f7ef33-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.949420 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4867dc-70fb-4533-a075-31fc03f7ef33-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.961108 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lvg4t\" (UniqueName: \"kubernetes.io/projected/ba4867dc-70fb-4533-a075-31fc03f7ef33-kube-api-access-lvg4t\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.972081 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.985869 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.992610 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vspq5"] Mar 13 20:45:58 crc kubenswrapper[4790]: W0313 20:45:58.997854 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc72ac557_7882_4120_b64a_4343639cc766.slice/crio-873da259123ec8b3b0869fe0330ca134574dd19dc1fdd69b82ccb26bd1fd40ca WatchSource:0}: Error finding container 873da259123ec8b3b0869fe0330ca134574dd19dc1fdd69b82ccb26bd1fd40ca: Status 404 returned error can't find the container with id 873da259123ec8b3b0869fe0330ca134574dd19dc1fdd69b82ccb26bd1fd40ca Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.002180 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 13 20:45:59 crc kubenswrapper[4790]: W0313 20:45:59.007656 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3980f8da_ddaa_4634_8c09_1a71ae19c58f.slice/crio-b119658e51cdaa8792d7650fa1e9769da237195e0e856928722f4699a4f3f673 WatchSource:0}: Error finding container b119658e51cdaa8792d7650fa1e9769da237195e0e856928722f4699a4f3f673: Status 404 
returned error can't find the container with id b119658e51cdaa8792d7650fa1e9769da237195e0e856928722f4699a4f3f673 Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.076015 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 20:45:59 crc kubenswrapper[4790]: W0313 20:45:59.078923 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5a24d7e_902f_4862_9c6b_8317f8fb3f29.slice/crio-438fa1f81223640149aaf756dfb2e57cd8cba9c4e3612bceda6c41fc1b7f3a5c WatchSource:0}: Error finding container 438fa1f81223640149aaf756dfb2e57cd8cba9c4e3612bceda6c41fc1b7f3a5c: Status 404 returned error can't find the container with id 438fa1f81223640149aaf756dfb2e57cd8cba9c4e3612bceda6c41fc1b7f3a5c Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.101420 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3980f8da-ddaa-4634-8c09-1a71ae19c58f","Type":"ContainerStarted","Data":"b119658e51cdaa8792d7650fa1e9769da237195e0e856928722f4699a4f3f673"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.110291 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c575f482-56cd-4dfc-84c6-c6bb922d56a9","Type":"ContainerStarted","Data":"dd3eb9a0e5bdb0287eed7cfa261bf8f63d9daa5df053b0925d31bc794e7ad761"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.123948 4790 generic.go:334] "Generic (PLEG): container finished" podID="61d662b4-cdc6-4d2f-a8a6-f71db4380caa" containerID="d5635f334bb1d0f55f8df6048568c51547f61cdf8fa854744c6f631fac79f9eb" exitCode=0 Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.124004 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" event={"ID":"61d662b4-cdc6-4d2f-a8a6-f71db4380caa","Type":"ContainerDied","Data":"d5635f334bb1d0f55f8df6048568c51547f61cdf8fa854744c6f631fac79f9eb"} Mar 13 20:45:59 
crc kubenswrapper[4790]: I0313 20:45:59.125966 4790 generic.go:334] "Generic (PLEG): container finished" podID="ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf" containerID="28a84259682ac8b19ed7f572691d8c2369de14cf6cf51002c97c47560eb5ee72" exitCode=0 Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.126078 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" event={"ID":"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf","Type":"ContainerDied","Data":"28a84259682ac8b19ed7f572691d8c2369de14cf6cf51002c97c47560eb5ee72"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.128775 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fceb0829-5f0e-4e78-a803-61afc5aa4d60","Type":"ContainerStarted","Data":"46cad1b617a70f8bcf113971d68047eede9646e0f3e9d5ff131aa856a2f3f0f9"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.130072 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f5a24d7e-902f-4862-9c6b-8317f8fb3f29","Type":"ContainerStarted","Data":"438fa1f81223640149aaf756dfb2e57cd8cba9c4e3612bceda6c41fc1b7f3a5c"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.131154 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b4696d4e-6124-4bcc-b257-651108f6b837","Type":"ContainerStarted","Data":"a3ba4dde9b3affbf2de80fd01b6004ec5bcc39b41c69eac7056b983bf5ce8c10"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.132103 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fa2face0-9349-4482-880a-b23cf41099b2","Type":"ContainerStarted","Data":"9412b68e06c64733faf3ac7751a6a1b8d9727402ab2300755813801ae5bb6cae"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.133097 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"e50b80fb-2251-49e7-a285-1276dbaa3237","Type":"ContainerStarted","Data":"6218f617d211db14656d09a088c6de02a6677348fa07bdf9d142d99af0111ad7"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.135068 4790 generic.go:334] "Generic (PLEG): container finished" podID="b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" containerID="8263c960933930e9418f327d1c70a8da265ccd7214d4f221c7150a432da81ec8" exitCode=0 Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.135125 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" event={"ID":"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe","Type":"ContainerDied","Data":"8263c960933930e9418f327d1c70a8da265ccd7214d4f221c7150a432da81ec8"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.137609 4790 generic.go:334] "Generic (PLEG): container finished" podID="3603867e-b715-48af-b4d3-248f69035bf4" containerID="49d625d0111656eb749d168f5c6aa08a6533bb845529b49927ec4ee997aab45d" exitCode=0 Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.137833 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" event={"ID":"3603867e-b715-48af-b4d3-248f69035bf4","Type":"ContainerDied","Data":"49d625d0111656eb749d168f5c6aa08a6533bb845529b49927ec4ee997aab45d"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.150732 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vspq5" event={"ID":"c72ac557-7882-4120-b64a-4343639cc766","Type":"ContainerStarted","Data":"873da259123ec8b3b0869fe0330ca134574dd19dc1fdd69b82ccb26bd1fd40ca"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.211396 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.274928 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-k7bzr"] Mar 13 20:45:59 crc kubenswrapper[4790]: W0313 20:45:59.308909 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c2d7175_fc2b_4492_ac1c_e2cc3dd44c58.slice/crio-9af32c1da0d6123293f1baa28364e78968ac47013cc06dd734f41c791ba7c168 WatchSource:0}: Error finding container 9af32c1da0d6123293f1baa28364e78968ac47013cc06dd734f41c791ba7c168: Status 404 returned error can't find the container with id 9af32c1da0d6123293f1baa28364e78968ac47013cc06dd734f41c791ba7c168 Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.567743 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.596406 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.663122 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-config\") pod \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.663524 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-dns-svc\") pod \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.663646 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-config\") pod \"61d662b4-cdc6-4d2f-a8a6-f71db4380caa\" (UID: \"61d662b4-cdc6-4d2f-a8a6-f71db4380caa\") " Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.663691 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb6c2\" (UniqueName: \"kubernetes.io/projected/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-kube-api-access-bb6c2\") pod \"61d662b4-cdc6-4d2f-a8a6-f71db4380caa\" (UID: \"61d662b4-cdc6-4d2f-a8a6-f71db4380caa\") " Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.663718 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d2bb\" (UniqueName: \"kubernetes.io/projected/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-kube-api-access-2d2bb\") pod \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.668300 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-kube-api-access-2d2bb" (OuterVolumeSpecName: "kube-api-access-2d2bb") pod "ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf" (UID: "ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf"). InnerVolumeSpecName "kube-api-access-2d2bb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.668647 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-kube-api-access-bb6c2" (OuterVolumeSpecName: "kube-api-access-bb6c2") pod "61d662b4-cdc6-4d2f-a8a6-f71db4380caa" (UID: "61d662b4-cdc6-4d2f-a8a6-f71db4380caa"). InnerVolumeSpecName "kube-api-access-bb6c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.685838 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-config" (OuterVolumeSpecName: "config") pod "61d662b4-cdc6-4d2f-a8a6-f71db4380caa" (UID: "61d662b4-cdc6-4d2f-a8a6-f71db4380caa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.692624 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf" (UID: "ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.705679 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-config" (OuterVolumeSpecName: "config") pod "ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf" (UID: "ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.794618 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-nrv7g"] Mar 13 20:45:59 crc kubenswrapper[4790]: E0313 20:45:59.795054 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf" containerName="init" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.795071 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf" containerName="init" Mar 13 20:45:59 crc kubenswrapper[4790]: E0313 20:45:59.795085 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61d662b4-cdc6-4d2f-a8a6-f71db4380caa" containerName="init" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.795093 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="61d662b4-cdc6-4d2f-a8a6-f71db4380caa" containerName="init" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.795294 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="61d662b4-cdc6-4d2f-a8a6-f71db4380caa" containerName="init" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.795308 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf" containerName="init" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.796008 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.797957 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.803027 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-combined-ca-bundle\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.803115 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-ovn-rundir\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.803206 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.803243 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-ovs-rundir\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.803274 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdn6h\" (UniqueName: \"kubernetes.io/projected/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-kube-api-access-cdn6h\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.803342 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-config\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.803429 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.803445 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb6c2\" (UniqueName: \"kubernetes.io/projected/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-kube-api-access-bb6c2\") on node \"crc\" DevicePath \"\"" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.803462 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d2bb\" (UniqueName: \"kubernetes.io/projected/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-kube-api-access-2d2bb\") on node \"crc\" DevicePath \"\"" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.803474 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.803488 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.806761 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nrv7g"] Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.905144 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-config\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.905205 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-combined-ca-bundle\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.905252 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-ovn-rundir\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.906402 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-config\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.906479 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-ovn-rundir\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.905322 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.908686 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-ovs-rundir\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.908735 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdn6h\" (UniqueName: \"kubernetes.io/projected/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-kube-api-access-cdn6h\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.909341 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-ovs-rundir\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.909923 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-combined-ca-bundle\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.910603 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.961615 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdn6h\" (UniqueName: \"kubernetes.io/projected/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-kube-api-access-cdn6h\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.057609 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rkgwq"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.104779 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-wrjx8"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.106480 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.109125 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.115913 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.115976 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.116019 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f68w4\" (UniqueName: \"kubernetes.io/projected/d371e679-2539-4a57-9993-6bd66f0d311e-kube-api-access-f68w4\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.116076 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-config\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.118320 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] 
Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.124693 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.160280 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-wrjx8"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.186752 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" event={"ID":"3603867e-b715-48af-b4d3-248f69035bf4","Type":"ContainerStarted","Data":"e4851609b13daf386b9a75dd93b93d11e73aa47ed2720e3772d9de0eeedc4882"} Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.187100 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.191438 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-c6rxs"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.192714 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7bzr" event={"ID":"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58","Type":"ContainerStarted","Data":"9af32c1da0d6123293f1baa28364e78968ac47013cc06dd734f41c791ba7c168"} Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.201318 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" event={"ID":"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf","Type":"ContainerDied","Data":"1e13e0bcda642d94f6b249dc823d2fd87698f812917e6d7b60f2ffc56fbe460d"} Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.201387 4790 scope.go:117] "RemoveContainer" containerID="28a84259682ac8b19ed7f572691d8c2369de14cf6cf51002c97c47560eb5ee72" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.201641 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.213338 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" event={"ID":"61d662b4-cdc6-4d2f-a8a6-f71db4380caa","Type":"ContainerDied","Data":"efbaf88a68a9782b2a6db13fe0d06640d640fb6b93b4efe14c9fed10e5292a92"} Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.213547 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.216440 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.216501 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f68w4\" (UniqueName: \"kubernetes.io/projected/d371e679-2539-4a57-9993-6bd66f0d311e-kube-api-access-f68w4\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.216582 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-config\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.216653 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-ovsdbserver-nb\") pod 
\"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.218593 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.219013 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.219485 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-config\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.219609 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557246-lrvrv"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.220567 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557246-lrvrv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.221173 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" event={"ID":"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe","Type":"ContainerStarted","Data":"195a67ad901633c714ab17db7e7888cdde3ea030f2d3e57f5ad0722e488347b8"} Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.221832 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.226045 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.226068 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.226122 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.234734 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557246-lrvrv"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.239693 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f68w4\" (UniqueName: \"kubernetes.io/projected/d371e679-2539-4a57-9993-6bd66f0d311e-kube-api-access-f68w4\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.271021 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7w2fv"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.272421 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.278105 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.279852 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7w2fv"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.280034 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" podStartSLOduration=3.37372235 podStartE2EDuration="16.280019664s" podCreationTimestamp="2026-03-13 20:45:44 +0000 UTC" firstStartedPulling="2026-03-13 20:45:45.468018889 +0000 UTC m=+1076.489134780" lastFinishedPulling="2026-03-13 20:45:58.374316203 +0000 UTC m=+1089.395432094" observedRunningTime="2026-03-13 20:46:00.212873227 +0000 UTC m=+1091.233989128" watchObservedRunningTime="2026-03-13 20:46:00.280019664 +0000 UTC m=+1091.301135555" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.294652 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" podStartSLOduration=3.59204327 podStartE2EDuration="16.294633661s" podCreationTimestamp="2026-03-13 20:45:44 +0000 UTC" firstStartedPulling="2026-03-13 20:45:45.641568361 +0000 UTC m=+1076.662684252" lastFinishedPulling="2026-03-13 20:45:58.344158752 +0000 UTC m=+1089.365274643" observedRunningTime="2026-03-13 20:46:00.246364838 +0000 UTC m=+1091.267480729" watchObservedRunningTime="2026-03-13 20:46:00.294633661 +0000 UTC m=+1091.315749552" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.318752 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.318810 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.318888 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-config\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.318958 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcv5k\" (UniqueName: \"kubernetes.io/projected/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-kube-api-access-hcv5k\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.318991 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtpm9\" (UniqueName: \"kubernetes.io/projected/97e8561a-a685-44f0-986c-1559e5818ba8-kube-api-access-mtpm9\") pod \"auto-csr-approver-29557246-lrvrv\" (UID: \"97e8561a-a685-44f0-986c-1559e5818ba8\") " pod="openshift-infra/auto-csr-approver-29557246-lrvrv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.319037 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-sb\") pod 
\"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.359549 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5s4r8"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.366232 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5s4r8"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.379667 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-shfrx"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.396475 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-shfrx"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.419897 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-config\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.419945 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcv5k\" (UniqueName: \"kubernetes.io/projected/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-kube-api-access-hcv5k\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.419971 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtpm9\" (UniqueName: \"kubernetes.io/projected/97e8561a-a685-44f0-986c-1559e5818ba8-kube-api-access-mtpm9\") pod \"auto-csr-approver-29557246-lrvrv\" (UID: \"97e8561a-a685-44f0-986c-1559e5818ba8\") " pod="openshift-infra/auto-csr-approver-29557246-lrvrv" Mar 13 20:46:00 crc 
kubenswrapper[4790]: I0313 20:46:00.419999 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.420035 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.420053 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.420929 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-config\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.421167 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.421676 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.421925 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.435787 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtpm9\" (UniqueName: \"kubernetes.io/projected/97e8561a-a685-44f0-986c-1559e5818ba8-kube-api-access-mtpm9\") pod \"auto-csr-approver-29557246-lrvrv\" (UID: \"97e8561a-a685-44f0-986c-1559e5818ba8\") " pod="openshift-infra/auto-csr-approver-29557246-lrvrv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.438257 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcv5k\" (UniqueName: \"kubernetes.io/projected/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-kube-api-access-hcv5k\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.439869 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.660878 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557246-lrvrv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.671448 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:01 crc kubenswrapper[4790]: I0313 20:46:01.232948 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" podUID="b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" containerName="dnsmasq-dns" containerID="cri-o://195a67ad901633c714ab17db7e7888cdde3ea030f2d3e57f5ad0722e488347b8" gracePeriod=10 Mar 13 20:46:01 crc kubenswrapper[4790]: I0313 20:46:01.679557 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61d662b4-cdc6-4d2f-a8a6-f71db4380caa" path="/var/lib/kubelet/pods/61d662b4-cdc6-4d2f-a8a6-f71db4380caa/volumes" Mar 13 20:46:01 crc kubenswrapper[4790]: I0313 20:46:01.680187 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf" path="/var/lib/kubelet/pods/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf/volumes" Mar 13 20:46:02 crc kubenswrapper[4790]: I0313 20:46:02.245908 4790 generic.go:334] "Generic (PLEG): container finished" podID="b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" containerID="195a67ad901633c714ab17db7e7888cdde3ea030f2d3e57f5ad0722e488347b8" exitCode=0 Mar 13 20:46:02 crc kubenswrapper[4790]: I0313 20:46:02.245980 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" event={"ID":"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe","Type":"ContainerDied","Data":"195a67ad901633c714ab17db7e7888cdde3ea030f2d3e57f5ad0722e488347b8"} Mar 13 20:46:02 crc kubenswrapper[4790]: I0313 20:46:02.246196 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" podUID="3603867e-b715-48af-b4d3-248f69035bf4" containerName="dnsmasq-dns" containerID="cri-o://e4851609b13daf386b9a75dd93b93d11e73aa47ed2720e3772d9de0eeedc4882" gracePeriod=10 Mar 13 20:46:02 crc kubenswrapper[4790]: I0313 20:46:02.730390 4790 scope.go:117] "RemoveContainer" 
containerID="d5635f334bb1d0f55f8df6048568c51547f61cdf8fa854744c6f631fac79f9eb" Mar 13 20:46:03 crc kubenswrapper[4790]: I0313 20:46:03.264875 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ba4867dc-70fb-4533-a075-31fc03f7ef33","Type":"ContainerStarted","Data":"2dddf44d9b971a802eb68d20dd187687fce5545cfc8a47258dff11cb22b24e69"} Mar 13 20:46:03 crc kubenswrapper[4790]: I0313 20:46:03.267138 4790 generic.go:334] "Generic (PLEG): container finished" podID="3603867e-b715-48af-b4d3-248f69035bf4" containerID="e4851609b13daf386b9a75dd93b93d11e73aa47ed2720e3772d9de0eeedc4882" exitCode=0 Mar 13 20:46:03 crc kubenswrapper[4790]: I0313 20:46:03.267187 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" event={"ID":"3603867e-b715-48af-b4d3-248f69035bf4","Type":"ContainerDied","Data":"e4851609b13daf386b9a75dd93b93d11e73aa47ed2720e3772d9de0eeedc4882"} Mar 13 20:46:04 crc kubenswrapper[4790]: I0313 20:46:04.863871 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.022291 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zkr4\" (UniqueName: \"kubernetes.io/projected/3603867e-b715-48af-b4d3-248f69035bf4-kube-api-access-5zkr4\") pod \"3603867e-b715-48af-b4d3-248f69035bf4\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.022710 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-config\") pod \"3603867e-b715-48af-b4d3-248f69035bf4\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.022762 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-dns-svc\") pod \"3603867e-b715-48af-b4d3-248f69035bf4\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.032362 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3603867e-b715-48af-b4d3-248f69035bf4-kube-api-access-5zkr4" (OuterVolumeSpecName: "kube-api-access-5zkr4") pod "3603867e-b715-48af-b4d3-248f69035bf4" (UID: "3603867e-b715-48af-b4d3-248f69035bf4"). InnerVolumeSpecName "kube-api-access-5zkr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.056666 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3603867e-b715-48af-b4d3-248f69035bf4" (UID: "3603867e-b715-48af-b4d3-248f69035bf4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.057513 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-config" (OuterVolumeSpecName: "config") pod "3603867e-b715-48af-b4d3-248f69035bf4" (UID: "3603867e-b715-48af-b4d3-248f69035bf4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.124876 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.124915 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.124927 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zkr4\" (UniqueName: \"kubernetes.io/projected/3603867e-b715-48af-b4d3-248f69035bf4-kube-api-access-5zkr4\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.291533 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" event={"ID":"3603867e-b715-48af-b4d3-248f69035bf4","Type":"ContainerDied","Data":"d0ca1d2819be476a413f05b3b41099985b59f5c5964c2c706ee5cb54a75ac1d5"} Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.291618 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.351314 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-c6rxs"] Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.358169 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-c6rxs"] Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.670742 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3603867e-b715-48af-b4d3-248f69035bf4" path="/var/lib/kubelet/pods/3603867e-b715-48af-b4d3-248f69035bf4/volumes" Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.952005 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.041191 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8ck7\" (UniqueName: \"kubernetes.io/projected/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-kube-api-access-v8ck7\") pod \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.041280 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-config\") pod \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.041353 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-dns-svc\") pod \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.044457 4790 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-kube-api-access-v8ck7" (OuterVolumeSpecName: "kube-api-access-v8ck7") pod "b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" (UID: "b63dd900-9f63-4b6a-b620-bd1dfaa88cfe"). InnerVolumeSpecName "kube-api-access-v8ck7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.084083 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" (UID: "b63dd900-9f63-4b6a-b620-bd1dfaa88cfe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.089732 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-config" (OuterVolumeSpecName: "config") pod "b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" (UID: "b63dd900-9f63-4b6a-b620-bd1dfaa88cfe"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.144449 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8ck7\" (UniqueName: \"kubernetes.io/projected/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-kube-api-access-v8ck7\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.144480 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.144489 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.300492 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" event={"ID":"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe","Type":"ContainerDied","Data":"137f1a12183cad427eac35ff712ec2bc7e38f51287a23e598f66e4d7a4466844"} Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.300557 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.333582 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rkgwq"] Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.340872 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rkgwq"] Mar 13 20:46:07 crc kubenswrapper[4790]: I0313 20:46:07.670191 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" path="/var/lib/kubelet/pods/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe/volumes" Mar 13 20:46:07 crc kubenswrapper[4790]: I0313 20:46:07.864321 4790 scope.go:117] "RemoveContainer" containerID="e4851609b13daf386b9a75dd93b93d11e73aa47ed2720e3772d9de0eeedc4882" Mar 13 20:46:08 crc kubenswrapper[4790]: I0313 20:46:08.465866 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557246-lrvrv"] Mar 13 20:46:08 crc kubenswrapper[4790]: I0313 20:46:08.770698 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nrv7g"] Mar 13 20:46:08 crc kubenswrapper[4790]: I0313 20:46:08.956929 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7w2fv"] Mar 13 20:46:09 crc kubenswrapper[4790]: I0313 20:46:09.140690 4790 scope.go:117] "RemoveContainer" containerID="49d625d0111656eb749d168f5c6aa08a6533bb845529b49927ec4ee997aab45d" Mar 13 20:46:09 crc kubenswrapper[4790]: W0313 20:46:09.186027 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfb0e0ca_d164_4e22_9d3f_055a45a372d2.slice/crio-361ffc08bda2f47fe8a49991e901f1ba5426f6bd7d04dd5c33869fc442ebda1d WatchSource:0}: Error finding container 361ffc08bda2f47fe8a49991e901f1ba5426f6bd7d04dd5c33869fc442ebda1d: Status 404 returned error can't find the container with id 
361ffc08bda2f47fe8a49991e901f1ba5426f6bd7d04dd5c33869fc442ebda1d Mar 13 20:46:09 crc kubenswrapper[4790]: W0313 20:46:09.261406 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5980214a_6a36_4a9b_bb65_1ca2b979d0cc.slice/crio-6e492e6d818823347461bc8ffdd10cbcadc5631db367ed321826eafcc8fcf49a WatchSource:0}: Error finding container 6e492e6d818823347461bc8ffdd10cbcadc5631db367ed321826eafcc8fcf49a: Status 404 returned error can't find the container with id 6e492e6d818823347461bc8ffdd10cbcadc5631db367ed321826eafcc8fcf49a Mar 13 20:46:09 crc kubenswrapper[4790]: I0313 20:46:09.304480 4790 scope.go:117] "RemoveContainer" containerID="195a67ad901633c714ab17db7e7888cdde3ea030f2d3e57f5ad0722e488347b8" Mar 13 20:46:09 crc kubenswrapper[4790]: I0313 20:46:09.343619 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" event={"ID":"5980214a-6a36-4a9b-bb65-1ca2b979d0cc","Type":"ContainerStarted","Data":"6e492e6d818823347461bc8ffdd10cbcadc5631db367ed321826eafcc8fcf49a"} Mar 13 20:46:09 crc kubenswrapper[4790]: I0313 20:46:09.348215 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nrv7g" event={"ID":"dfb0e0ca-d164-4e22-9d3f-055a45a372d2","Type":"ContainerStarted","Data":"361ffc08bda2f47fe8a49991e901f1ba5426f6bd7d04dd5c33869fc442ebda1d"} Mar 13 20:46:09 crc kubenswrapper[4790]: I0313 20:46:09.349318 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557246-lrvrv" event={"ID":"97e8561a-a685-44f0-986c-1559e5818ba8","Type":"ContainerStarted","Data":"43aee0bab3af6a8bfdda4bb90672879a71cedb5993a623158c3730e25f5f67ba"} Mar 13 20:46:09 crc kubenswrapper[4790]: I0313 20:46:09.500140 4790 scope.go:117] "RemoveContainer" containerID="8263c960933930e9418f327d1c70a8da265ccd7214d4f221c7150a432da81ec8" Mar 13 20:46:09 crc kubenswrapper[4790]: I0313 20:46:09.584231 4790 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-wrjx8"] Mar 13 20:46:09 crc kubenswrapper[4790]: W0313 20:46:09.823343 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd371e679_2539_4a57_9993_6bd66f0d311e.slice/crio-e1ec4e6dbb07edb7337832aef993c9775ea9bc7102522ba760952f131af54261 WatchSource:0}: Error finding container e1ec4e6dbb07edb7337832aef993c9775ea9bc7102522ba760952f131af54261: Status 404 returned error can't find the container with id e1ec4e6dbb07edb7337832aef993c9775ea9bc7102522ba760952f131af54261 Mar 13 20:46:10 crc kubenswrapper[4790]: I0313 20:46:10.230249 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" podUID="b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.100:5353: i/o timeout" Mar 13 20:46:10 crc kubenswrapper[4790]: I0313 20:46:10.359839 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3980f8da-ddaa-4634-8c09-1a71ae19c58f","Type":"ContainerStarted","Data":"38687d46bd8558e8ff19623ab6b544af687549c5f2646731edfb1896ed86a605"} Mar 13 20:46:10 crc kubenswrapper[4790]: I0313 20:46:10.360522 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 13 20:46:10 crc kubenswrapper[4790]: I0313 20:46:10.363976 4790 generic.go:334] "Generic (PLEG): container finished" podID="5980214a-6a36-4a9b-bb65-1ca2b979d0cc" containerID="d7c869a2f8b8b93e2f3889d9c52b9758cafc80caf9c5547281b3ea804ea29dd6" exitCode=0 Mar 13 20:46:10 crc kubenswrapper[4790]: I0313 20:46:10.364053 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" event={"ID":"5980214a-6a36-4a9b-bb65-1ca2b979d0cc","Type":"ContainerDied","Data":"d7c869a2f8b8b93e2f3889d9c52b9758cafc80caf9c5547281b3ea804ea29dd6"} Mar 13 20:46:10 crc 
kubenswrapper[4790]: I0313 20:46:10.365736 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" event={"ID":"d371e679-2539-4a57-9993-6bd66f0d311e","Type":"ContainerStarted","Data":"e1ec4e6dbb07edb7337832aef993c9775ea9bc7102522ba760952f131af54261"} Mar 13 20:46:10 crc kubenswrapper[4790]: I0313 20:46:10.408890 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=12.113558536 podStartE2EDuration="22.408873801s" podCreationTimestamp="2026-03-13 20:45:48 +0000 UTC" firstStartedPulling="2026-03-13 20:45:59.009503521 +0000 UTC m=+1090.030619412" lastFinishedPulling="2026-03-13 20:46:09.304818786 +0000 UTC m=+1100.325934677" observedRunningTime="2026-03-13 20:46:10.379674536 +0000 UTC m=+1101.400790437" watchObservedRunningTime="2026-03-13 20:46:10.408873801 +0000 UTC m=+1101.429989692" Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.375272 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fceb0829-5f0e-4e78-a803-61afc5aa4d60","Type":"ContainerStarted","Data":"86e0632ee7d85ec4a092fdd91f4cf4501da9716d6ba3776527053aa4e34f6f82"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.378761 4790 generic.go:334] "Generic (PLEG): container finished" podID="97e8561a-a685-44f0-986c-1559e5818ba8" containerID="a76e1c0d1beff75ffaa42ee8715fd9733a320b575bcb2a1602abbb7840ddf694" exitCode=0 Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.378814 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557246-lrvrv" event={"ID":"97e8561a-a685-44f0-986c-1559e5818ba8","Type":"ContainerDied","Data":"a76e1c0d1beff75ffaa42ee8715fd9733a320b575bcb2a1602abbb7840ddf694"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.382931 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"b4696d4e-6124-4bcc-b257-651108f6b837","Type":"ContainerStarted","Data":"7c5a942da36087bdc3e181e8806caccf07be11d3c05fd4b5b28443007ca270c8"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.383622 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.391938 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e50b80fb-2251-49e7-a285-1276dbaa3237","Type":"ContainerStarted","Data":"e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.415518 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" event={"ID":"5980214a-6a36-4a9b-bb65-1ca2b979d0cc","Type":"ContainerStarted","Data":"e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.415628 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.418958 4790 generic.go:334] "Generic (PLEG): container finished" podID="d371e679-2539-4a57-9993-6bd66f0d311e" containerID="35c31d96d7cafc55ea2241f29c1688f2e4d149d458481295371f887866ab8d03" exitCode=0 Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.418984 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" event={"ID":"d371e679-2539-4a57-9993-6bd66f0d311e","Type":"ContainerDied","Data":"35c31d96d7cafc55ea2241f29c1688f2e4d149d458481295371f887866ab8d03"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.420633 4790 generic.go:334] "Generic (PLEG): container finished" podID="8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58" containerID="c7b64ce449f3b79cdcd1395e4a62437aae1930467d8a3439c9aa108a81cbf57c" exitCode=0 Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 
20:46:11.420685 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7bzr" event={"ID":"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58","Type":"ContainerDied","Data":"c7b64ce449f3b79cdcd1395e4a62437aae1930467d8a3439c9aa108a81cbf57c"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.423678 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f5a24d7e-902f-4862-9c6b-8317f8fb3f29","Type":"ContainerStarted","Data":"dbf49fcbf9f4f102f77702250fea555eb9b6bde16734c9db16b78132dfde5910"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.425811 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fa2face0-9349-4482-880a-b23cf41099b2","Type":"ContainerStarted","Data":"0e42fee0e10bca8be201fe50501a5e82454ac7b9cad70b1bd8bc28c89423c299"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.428328 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ba4867dc-70fb-4533-a075-31fc03f7ef33","Type":"ContainerStarted","Data":"6dd4cb3a76b88d73adca13ca1234e6df84737d567b9b78049d28b310fb4f5f23"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.430298 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c575f482-56cd-4dfc-84c6-c6bb922d56a9","Type":"ContainerStarted","Data":"a8891038882e88af0702659321fde381a785634e4a17975de8d9af4797337040"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.433511 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vspq5" event={"ID":"c72ac557-7882-4120-b64a-4343639cc766","Type":"ContainerStarted","Data":"b56543c8608a9d7acf4c66ad8d1e279c901c3ebfb8dd790dfb8aad524883d947"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.433542 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-vspq5" Mar 13 20:46:11 crc 
kubenswrapper[4790]: I0313 20:46:11.471774 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=9.544165709 podStartE2EDuration="20.471755874s" podCreationTimestamp="2026-03-13 20:45:51 +0000 UTC" firstStartedPulling="2026-03-13 20:45:58.986122306 +0000 UTC m=+1090.007238197" lastFinishedPulling="2026-03-13 20:46:09.913712471 +0000 UTC m=+1100.934828362" observedRunningTime="2026-03-13 20:46:11.450727882 +0000 UTC m=+1102.471843773" watchObservedRunningTime="2026-03-13 20:46:11.471755874 +0000 UTC m=+1102.492871755" Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.513102 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" podStartSLOduration=11.513081049 podStartE2EDuration="11.513081049s" podCreationTimestamp="2026-03-13 20:46:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:46:11.50911847 +0000 UTC m=+1102.530234371" watchObservedRunningTime="2026-03-13 20:46:11.513081049 +0000 UTC m=+1102.534196940" Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.556779 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-vspq5" podStartSLOduration=6.753032632 podStartE2EDuration="17.556760316s" podCreationTimestamp="2026-03-13 20:45:54 +0000 UTC" firstStartedPulling="2026-03-13 20:45:58.999747907 +0000 UTC m=+1090.020863798" lastFinishedPulling="2026-03-13 20:46:09.803475591 +0000 UTC m=+1100.824591482" observedRunningTime="2026-03-13 20:46:11.554533866 +0000 UTC m=+1102.575649757" watchObservedRunningTime="2026-03-13 20:46:11.556760316 +0000 UTC m=+1102.577876207" Mar 13 20:46:13 crc kubenswrapper[4790]: I0313 20:46:13.454315 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557246-lrvrv" 
event={"ID":"97e8561a-a685-44f0-986c-1559e5818ba8","Type":"ContainerDied","Data":"43aee0bab3af6a8bfdda4bb90672879a71cedb5993a623158c3730e25f5f67ba"} Mar 13 20:46:13 crc kubenswrapper[4790]: I0313 20:46:13.454803 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43aee0bab3af6a8bfdda4bb90672879a71cedb5993a623158c3730e25f5f67ba" Mar 13 20:46:13 crc kubenswrapper[4790]: I0313 20:46:13.516330 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557246-lrvrv" Mar 13 20:46:13 crc kubenswrapper[4790]: I0313 20:46:13.589183 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtpm9\" (UniqueName: \"kubernetes.io/projected/97e8561a-a685-44f0-986c-1559e5818ba8-kube-api-access-mtpm9\") pod \"97e8561a-a685-44f0-986c-1559e5818ba8\" (UID: \"97e8561a-a685-44f0-986c-1559e5818ba8\") " Mar 13 20:46:13 crc kubenswrapper[4790]: I0313 20:46:13.594674 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97e8561a-a685-44f0-986c-1559e5818ba8-kube-api-access-mtpm9" (OuterVolumeSpecName: "kube-api-access-mtpm9") pod "97e8561a-a685-44f0-986c-1559e5818ba8" (UID: "97e8561a-a685-44f0-986c-1559e5818ba8"). InnerVolumeSpecName "kube-api-access-mtpm9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:13 crc kubenswrapper[4790]: I0313 20:46:13.692037 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtpm9\" (UniqueName: \"kubernetes.io/projected/97e8561a-a685-44f0-986c-1559e5818ba8-kube-api-access-mtpm9\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.465515 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" event={"ID":"d371e679-2539-4a57-9993-6bd66f0d311e","Type":"ContainerStarted","Data":"b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb"} Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.465934 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.468353 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ba4867dc-70fb-4533-a075-31fc03f7ef33","Type":"ContainerStarted","Data":"70a4726ebbb5eeec16130e19d4e9f480c13925eada3756ca2318b730eb7fce0e"} Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.469889 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nrv7g" event={"ID":"dfb0e0ca-d164-4e22-9d3f-055a45a372d2","Type":"ContainerStarted","Data":"adc15f321ac61e0a2850691c665b578baab9c591f9b64c5be8f302eda5223247"} Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.472134 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7bzr" event={"ID":"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58","Type":"ContainerStarted","Data":"0c13caebc66cb1cb2f2fc42dc4ef55e548b61bf3abdcdbc2c1d4701e7157becb"} Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.472158 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7bzr" 
event={"ID":"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58","Type":"ContainerStarted","Data":"54e2add76348c89f728c121ee9ca9e2012e5b1c658d6628cf3428d30665141c8"} Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.472205 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.472232 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.473971 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f5a24d7e-902f-4862-9c6b-8317f8fb3f29","Type":"ContainerStarted","Data":"a44b1c232e61c6d1d71c2c09572fa8266651ab19b81937b993ea6c9db5e2c25a"} Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.475435 4790 generic.go:334] "Generic (PLEG): container finished" podID="fa2face0-9349-4482-880a-b23cf41099b2" containerID="0e42fee0e10bca8be201fe50501a5e82454ac7b9cad70b1bd8bc28c89423c299" exitCode=0 Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.475516 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fa2face0-9349-4482-880a-b23cf41099b2","Type":"ContainerDied","Data":"0e42fee0e10bca8be201fe50501a5e82454ac7b9cad70b1bd8bc28c89423c299"} Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.475737 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557246-lrvrv" Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.510543 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" podStartSLOduration=15.510524869 podStartE2EDuration="15.510524869s" podCreationTimestamp="2026-03-13 20:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:46:14.487715818 +0000 UTC m=+1105.508831719" watchObservedRunningTime="2026-03-13 20:46:14.510524869 +0000 UTC m=+1105.531640760" Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.516232 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.870753563 podStartE2EDuration="17.516211113s" podCreationTimestamp="2026-03-13 20:45:57 +0000 UTC" firstStartedPulling="2026-03-13 20:46:02.751107935 +0000 UTC m=+1093.772223826" lastFinishedPulling="2026-03-13 20:46:13.396565485 +0000 UTC m=+1104.417681376" observedRunningTime="2026-03-13 20:46:14.50582138 +0000 UTC m=+1105.526937271" watchObservedRunningTime="2026-03-13 20:46:14.516211113 +0000 UTC m=+1105.537327004" Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.537005 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=7.237340293 podStartE2EDuration="21.536984218s" podCreationTimestamp="2026-03-13 20:45:53 +0000 UTC" firstStartedPulling="2026-03-13 20:45:59.081210082 +0000 UTC m=+1090.102325973" lastFinishedPulling="2026-03-13 20:46:13.380854007 +0000 UTC m=+1104.401969898" observedRunningTime="2026-03-13 20:46:14.532516847 +0000 UTC m=+1105.553632758" watchObservedRunningTime="2026-03-13 20:46:14.536984218 +0000 UTC m=+1105.558100109" Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.574039 4790 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ovn-controller-ovs-k7bzr" podStartSLOduration=10.580376808 podStartE2EDuration="20.574018376s" podCreationTimestamp="2026-03-13 20:45:54 +0000 UTC" firstStartedPulling="2026-03-13 20:45:59.310799458 +0000 UTC m=+1090.331915349" lastFinishedPulling="2026-03-13 20:46:09.304441026 +0000 UTC m=+1100.325556917" observedRunningTime="2026-03-13 20:46:14.566634315 +0000 UTC m=+1105.587750206" watchObservedRunningTime="2026-03-13 20:46:14.574018376 +0000 UTC m=+1105.595134267" Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.610071 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557240-8qw5d"] Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.616233 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-nrv7g" podStartSLOduration=11.425832602 podStartE2EDuration="15.616215603s" podCreationTimestamp="2026-03-13 20:45:59 +0000 UTC" firstStartedPulling="2026-03-13 20:46:09.190701243 +0000 UTC m=+1100.211817134" lastFinishedPulling="2026-03-13 20:46:13.381084244 +0000 UTC m=+1104.402200135" observedRunningTime="2026-03-13 20:46:14.583187385 +0000 UTC m=+1105.604303276" watchObservedRunningTime="2026-03-13 20:46:14.616215603 +0000 UTC m=+1105.637331494" Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.617451 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557240-8qw5d"] Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.861244 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 13 20:46:15 crc kubenswrapper[4790]: I0313 20:46:15.485680 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fa2face0-9349-4482-880a-b23cf41099b2","Type":"ContainerStarted","Data":"ecd707787ab07582c744789073f71c6681c500cf47b72baeab75726ed26695eb"} Mar 13 20:46:15 crc kubenswrapper[4790]: 
I0313 20:46:15.488005 4790 generic.go:334] "Generic (PLEG): container finished" podID="fceb0829-5f0e-4e78-a803-61afc5aa4d60" containerID="86e0632ee7d85ec4a092fdd91f4cf4501da9716d6ba3776527053aa4e34f6f82" exitCode=0 Mar 13 20:46:15 crc kubenswrapper[4790]: I0313 20:46:15.488146 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fceb0829-5f0e-4e78-a803-61afc5aa4d60","Type":"ContainerDied","Data":"86e0632ee7d85ec4a092fdd91f4cf4501da9716d6ba3776527053aa4e34f6f82"} Mar 13 20:46:15 crc kubenswrapper[4790]: I0313 20:46:15.514569 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=17.469506821 podStartE2EDuration="28.514541481s" podCreationTimestamp="2026-03-13 20:45:47 +0000 UTC" firstStartedPulling="2026-03-13 20:45:58.229924745 +0000 UTC m=+1089.251040636" lastFinishedPulling="2026-03-13 20:46:09.274959405 +0000 UTC m=+1100.296075296" observedRunningTime="2026-03-13 20:46:15.505419983 +0000 UTC m=+1106.526535884" watchObservedRunningTime="2026-03-13 20:46:15.514541481 +0000 UTC m=+1106.535657382" Mar 13 20:46:15 crc kubenswrapper[4790]: I0313 20:46:15.670970 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6f1fa3a-7f88-4e89-bd00-4426798fccce" path="/var/lib/kubelet/pods/f6f1fa3a-7f88-4e89-bd00-4426798fccce/volumes" Mar 13 20:46:15 crc kubenswrapper[4790]: I0313 20:46:15.675539 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:15 crc kubenswrapper[4790]: I0313 20:46:15.739824 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-wrjx8"] Mar 13 20:46:15 crc kubenswrapper[4790]: I0313 20:46:15.861695 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 13 20:46:15 crc kubenswrapper[4790]: I0313 20:46:15.906855 4790 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 13 20:46:16 crc kubenswrapper[4790]: I0313 20:46:16.498658 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fceb0829-5f0e-4e78-a803-61afc5aa4d60","Type":"ContainerStarted","Data":"7367436a9fe0a765fc7d324d2f702d2ba65dbe1ea3313bfa3b02a185aee63c92"} Mar 13 20:46:16 crc kubenswrapper[4790]: I0313 20:46:16.498999 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" podUID="d371e679-2539-4a57-9993-6bd66f0d311e" containerName="dnsmasq-dns" containerID="cri-o://b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb" gracePeriod=10 Mar 13 20:46:16 crc kubenswrapper[4790]: I0313 20:46:16.531586 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=21.411194354 podStartE2EDuration="30.531566487s" podCreationTimestamp="2026-03-13 20:45:46 +0000 UTC" firstStartedPulling="2026-03-13 20:45:58.875147186 +0000 UTC m=+1089.896263077" lastFinishedPulling="2026-03-13 20:46:07.995519309 +0000 UTC m=+1099.016635210" observedRunningTime="2026-03-13 20:46:16.522426628 +0000 UTC m=+1107.543542559" watchObservedRunningTime="2026-03-13 20:46:16.531566487 +0000 UTC m=+1107.552682388" Mar 13 20:46:16 crc kubenswrapper[4790]: I0313 20:46:16.547967 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 13 20:46:16 crc kubenswrapper[4790]: I0313 20:46:16.960794 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.059593 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-dns-svc\") pod \"d371e679-2539-4a57-9993-6bd66f0d311e\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.059688 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-ovsdbserver-nb\") pod \"d371e679-2539-4a57-9993-6bd66f0d311e\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.059763 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f68w4\" (UniqueName: \"kubernetes.io/projected/d371e679-2539-4a57-9993-6bd66f0d311e-kube-api-access-f68w4\") pod \"d371e679-2539-4a57-9993-6bd66f0d311e\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.059848 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-config\") pod \"d371e679-2539-4a57-9993-6bd66f0d311e\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.065836 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d371e679-2539-4a57-9993-6bd66f0d311e-kube-api-access-f68w4" (OuterVolumeSpecName: "kube-api-access-f68w4") pod "d371e679-2539-4a57-9993-6bd66f0d311e" (UID: "d371e679-2539-4a57-9993-6bd66f0d311e"). InnerVolumeSpecName "kube-api-access-f68w4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.100392 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-config" (OuterVolumeSpecName: "config") pod "d371e679-2539-4a57-9993-6bd66f0d311e" (UID: "d371e679-2539-4a57-9993-6bd66f0d311e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.100411 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d371e679-2539-4a57-9993-6bd66f0d311e" (UID: "d371e679-2539-4a57-9993-6bd66f0d311e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.101876 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d371e679-2539-4a57-9993-6bd66f0d311e" (UID: "d371e679-2539-4a57-9993-6bd66f0d311e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.161043 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.161071 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.161083 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f68w4\" (UniqueName: \"kubernetes.io/projected/d371e679-2539-4a57-9993-6bd66f0d311e-kube-api-access-f68w4\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.161092 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.212675 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.258839 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.510260 4790 generic.go:334] "Generic (PLEG): container finished" podID="d371e679-2539-4a57-9993-6bd66f0d311e" containerID="b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb" exitCode=0 Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.510325 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" 
event={"ID":"d371e679-2539-4a57-9993-6bd66f0d311e","Type":"ContainerDied","Data":"b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb"} Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.510367 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.510417 4790 scope.go:117] "RemoveContainer" containerID="b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.510401 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" event={"ID":"d371e679-2539-4a57-9993-6bd66f0d311e","Type":"ContainerDied","Data":"e1ec4e6dbb07edb7337832aef993c9775ea9bc7102522ba760952f131af54261"} Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.510754 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.534436 4790 scope.go:117] "RemoveContainer" containerID="35c31d96d7cafc55ea2241f29c1688f2e4d149d458481295371f887866ab8d03" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.544255 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-wrjx8"] Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.550633 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-wrjx8"] Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.551214 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.577736 4790 scope.go:117] "RemoveContainer" containerID="b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb" Mar 13 20:46:17 crc kubenswrapper[4790]: E0313 20:46:17.579778 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb\": container with ID starting with b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb not found: ID does not exist" containerID="b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.579822 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb"} err="failed to get container status \"b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb\": rpc error: code = NotFound desc = could not find container \"b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb\": container with ID starting with b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb not found: ID does not exist" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.579849 4790 scope.go:117] "RemoveContainer" containerID="35c31d96d7cafc55ea2241f29c1688f2e4d149d458481295371f887866ab8d03" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.579961 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.581353 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 13 20:46:17 crc kubenswrapper[4790]: E0313 20:46:17.581510 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35c31d96d7cafc55ea2241f29c1688f2e4d149d458481295371f887866ab8d03\": container with ID starting with 35c31d96d7cafc55ea2241f29c1688f2e4d149d458481295371f887866ab8d03 not found: ID does not exist" containerID="35c31d96d7cafc55ea2241f29c1688f2e4d149d458481295371f887866ab8d03" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.581549 4790 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35c31d96d7cafc55ea2241f29c1688f2e4d149d458481295371f887866ab8d03"} err="failed to get container status \"35c31d96d7cafc55ea2241f29c1688f2e4d149d458481295371f887866ab8d03\": rpc error: code = NotFound desc = could not find container \"35c31d96d7cafc55ea2241f29c1688f2e4d149d458481295371f887866ab8d03\": container with ID starting with 35c31d96d7cafc55ea2241f29c1688f2e4d149d458481295371f887866ab8d03 not found: ID does not exist" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.671820 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d371e679-2539-4a57-9993-6bd66f0d311e" path="/var/lib/kubelet/pods/d371e679-2539-4a57-9993-6bd66f0d311e/volumes" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.702989 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 13 20:46:17 crc kubenswrapper[4790]: E0313 20:46:17.703297 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d371e679-2539-4a57-9993-6bd66f0d311e" containerName="init" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.703314 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d371e679-2539-4a57-9993-6bd66f0d311e" containerName="init" Mar 13 20:46:17 crc kubenswrapper[4790]: E0313 20:46:17.703340 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" containerName="init" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.703346 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" containerName="init" Mar 13 20:46:17 crc kubenswrapper[4790]: E0313 20:46:17.703362 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d371e679-2539-4a57-9993-6bd66f0d311e" containerName="dnsmasq-dns" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.703369 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d371e679-2539-4a57-9993-6bd66f0d311e" containerName="dnsmasq-dns" Mar 13 20:46:17 crc kubenswrapper[4790]: E0313 20:46:17.703408 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e8561a-a685-44f0-986c-1559e5818ba8" containerName="oc" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.703419 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e8561a-a685-44f0-986c-1559e5818ba8" containerName="oc" Mar 13 20:46:17 crc kubenswrapper[4790]: E0313 20:46:17.705700 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3603867e-b715-48af-b4d3-248f69035bf4" containerName="init" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.705731 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3603867e-b715-48af-b4d3-248f69035bf4" containerName="init" Mar 13 20:46:17 crc kubenswrapper[4790]: E0313 20:46:17.705766 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3603867e-b715-48af-b4d3-248f69035bf4" containerName="dnsmasq-dns" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.705774 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3603867e-b715-48af-b4d3-248f69035bf4" containerName="dnsmasq-dns" Mar 13 20:46:17 crc kubenswrapper[4790]: E0313 20:46:17.705798 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" containerName="dnsmasq-dns" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.705806 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" containerName="dnsmasq-dns" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.706072 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="97e8561a-a685-44f0-986c-1559e5818ba8" containerName="oc" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.706084 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="3603867e-b715-48af-b4d3-248f69035bf4" containerName="dnsmasq-dns" Mar 13 
20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.706094 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" containerName="dnsmasq-dns" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.706111 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d371e679-2539-4a57-9993-6bd66f0d311e" containerName="dnsmasq-dns" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.706911 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.718910 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.719238 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-96kb4" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.719879 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.720028 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.721341 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.872270 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4knsx\" (UniqueName: \"kubernetes.io/projected/18e18c94-0ce6-4578-a224-384826512a34-kube-api-access-4knsx\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.872390 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/18e18c94-0ce6-4578-a224-384826512a34-config\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.872495 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e18c94-0ce6-4578-a224-384826512a34-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.872703 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/18e18c94-0ce6-4578-a224-384826512a34-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.872750 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e18c94-0ce6-4578-a224-384826512a34-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.872775 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e18c94-0ce6-4578-a224-384826512a34-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.872811 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18e18c94-0ce6-4578-a224-384826512a34-scripts\") pod \"ovn-northd-0\" (UID: 
\"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.974937 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/18e18c94-0ce6-4578-a224-384826512a34-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.975011 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e18c94-0ce6-4578-a224-384826512a34-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.975031 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e18c94-0ce6-4578-a224-384826512a34-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.975054 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18e18c94-0ce6-4578-a224-384826512a34-scripts\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.975116 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4knsx\" (UniqueName: \"kubernetes.io/projected/18e18c94-0ce6-4578-a224-384826512a34-kube-api-access-4knsx\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.975137 4790 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18e18c94-0ce6-4578-a224-384826512a34-config\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.975183 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e18c94-0ce6-4578-a224-384826512a34-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.975528 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/18e18c94-0ce6-4578-a224-384826512a34-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.976259 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18e18c94-0ce6-4578-a224-384826512a34-scripts\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.976264 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18e18c94-0ce6-4578-a224-384826512a34-config\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.979643 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e18c94-0ce6-4578-a224-384826512a34-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 
20:46:17.979874 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e18c94-0ce6-4578-a224-384826512a34-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.986087 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e18c94-0ce6-4578-a224-384826512a34-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.993008 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4knsx\" (UniqueName: \"kubernetes.io/projected/18e18c94-0ce6-4578-a224-384826512a34-kube-api-access-4knsx\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:18 crc kubenswrapper[4790]: I0313 20:46:18.029436 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 20:46:18 crc kubenswrapper[4790]: W0313 20:46:18.450886 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18e18c94_0ce6_4578_a224_384826512a34.slice/crio-c97fd7a25afd369d5bd6017efa0660e834e67bb594f27cde893e62a7317b2719 WatchSource:0}: Error finding container c97fd7a25afd369d5bd6017efa0660e834e67bb594f27cde893e62a7317b2719: Status 404 returned error can't find the container with id c97fd7a25afd369d5bd6017efa0660e834e67bb594f27cde893e62a7317b2719 Mar 13 20:46:18 crc kubenswrapper[4790]: I0313 20:46:18.456796 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 20:46:18 crc kubenswrapper[4790]: I0313 20:46:18.519526 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"18e18c94-0ce6-4578-a224-384826512a34","Type":"ContainerStarted","Data":"c97fd7a25afd369d5bd6017efa0660e834e67bb594f27cde893e62a7317b2719"} Mar 13 20:46:18 crc kubenswrapper[4790]: I0313 20:46:18.939039 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 13 20:46:18 crc kubenswrapper[4790]: I0313 20:46:18.939112 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 13 20:46:19 crc kubenswrapper[4790]: I0313 20:46:19.248223 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 13 20:46:19 crc kubenswrapper[4790]: E0313 20:46:19.501869 4790 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.143:36798->38.102.83.143:39163: write tcp 38.102.83.143:36798->38.102.83.143:39163: write: broken pipe Mar 13 20:46:20 crc kubenswrapper[4790]: I0313 20:46:20.537109 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"18e18c94-0ce6-4578-a224-384826512a34","Type":"ContainerStarted","Data":"0bdeecb17f1f462f15f7514a6de1d42f2a2888bfdb2c48b2e4f3fa9a499f4076"} Mar 13 20:46:20 crc kubenswrapper[4790]: I0313 20:46:20.537449 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"18e18c94-0ce6-4578-a224-384826512a34","Type":"ContainerStarted","Data":"195304f3dcc55f6eff0508aa06cf89a02ba45b95a698d078d2a77be9a2a267ba"} Mar 13 20:46:20 crc kubenswrapper[4790]: I0313 20:46:20.537465 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 13 20:46:20 crc kubenswrapper[4790]: I0313 20:46:20.561627 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.509577958 podStartE2EDuration="3.561590037s" podCreationTimestamp="2026-03-13 20:46:17 +0000 UTC" firstStartedPulling="2026-03-13 20:46:18.453133759 +0000 UTC m=+1109.474249650" lastFinishedPulling="2026-03-13 20:46:19.505145838 +0000 UTC m=+1110.526261729" observedRunningTime="2026-03-13 20:46:20.55617708 +0000 UTC m=+1111.577292971" watchObservedRunningTime="2026-03-13 20:46:20.561590037 +0000 UTC m=+1111.582705978" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.352829 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-gv56q"] Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.354228 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.365367 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gv56q"] Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.415648 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.433541 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.433594 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-dns-svc\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.433737 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmp6q\" (UniqueName: \"kubernetes.io/projected/d798b6d8-8c2b-4827-81d3-09177054591f-kube-api-access-bmp6q\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.433804 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-config\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " 
pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.433948 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.531843 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.534929 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.534981 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-dns-svc\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.535043 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmp6q\" (UniqueName: \"kubernetes.io/projected/d798b6d8-8c2b-4827-81d3-09177054591f-kube-api-access-bmp6q\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.535069 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-config\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.535117 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.536268 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.536575 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.538026 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-config\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.539285 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-dns-svc\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: 
\"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.554924 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmp6q\" (UniqueName: \"kubernetes.io/projected/d798b6d8-8c2b-4827-81d3-09177054591f-kube-api-access-bmp6q\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.616491 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.676929 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:22 crc kubenswrapper[4790]: W0313 20:46:22.117138 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd798b6d8_8c2b_4827_81d3_09177054591f.slice/crio-c3c4a9ad42afcf8df041c9c0555750547f59bfe84f27594765b42944074e50c3 WatchSource:0}: Error finding container c3c4a9ad42afcf8df041c9c0555750547f59bfe84f27594765b42944074e50c3: Status 404 returned error can't find the container with id c3c4a9ad42afcf8df041c9c0555750547f59bfe84f27594765b42944074e50c3 Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.126919 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gv56q"] Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.483844 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.490086 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.492084 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.492574 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-ngh5j" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.492626 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.493356 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.504206 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.551203 4790 generic.go:334] "Generic (PLEG): container finished" podID="d798b6d8-8c2b-4827-81d3-09177054591f" containerID="978e68813566a9c04dd155a064a373e2649857a2eefbe05ca9b8949d3e9db280" exitCode=0 Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.551244 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gv56q" event={"ID":"d798b6d8-8c2b-4827-81d3-09177054591f","Type":"ContainerDied","Data":"978e68813566a9c04dd155a064a373e2649857a2eefbe05ca9b8949d3e9db280"} Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.551268 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gv56q" event={"ID":"d798b6d8-8c2b-4827-81d3-09177054591f","Type":"ContainerStarted","Data":"c3c4a9ad42afcf8df041c9c0555750547f59bfe84f27594765b42944074e50c3"} Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.552063 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/529b41ec-f1ee-432c-ac41-6957e1809aaa-lock\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.552166 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.552247 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529b41ec-f1ee-432c-ac41-6957e1809aaa-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.552355 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/529b41ec-f1ee-432c-ac41-6957e1809aaa-cache\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.552419 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cr9q\" (UniqueName: \"kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-kube-api-access-6cr9q\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.552450 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: 
\"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.653585 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529b41ec-f1ee-432c-ac41-6957e1809aaa-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.653841 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/529b41ec-f1ee-432c-ac41-6957e1809aaa-cache\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.653927 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cr9q\" (UniqueName: \"kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-kube-api-access-6cr9q\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.654061 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.654174 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/529b41ec-f1ee-432c-ac41-6957e1809aaa-lock\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.654272 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: E0313 20:46:22.654467 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 20:46:22 crc kubenswrapper[4790]: E0313 20:46:22.654533 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 20:46:22 crc kubenswrapper[4790]: E0313 20:46:22.654642 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift podName:529b41ec-f1ee-432c-ac41-6957e1809aaa nodeName:}" failed. No retries permitted until 2026-03-13 20:46:23.154622474 +0000 UTC m=+1114.175738365 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift") pod "swift-storage-0" (UID: "529b41ec-f1ee-432c-ac41-6957e1809aaa") : configmap "swift-ring-files" not found Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.656006 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.657312 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/529b41ec-f1ee-432c-ac41-6957e1809aaa-cache\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 
20:46:22.657414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/529b41ec-f1ee-432c-ac41-6957e1809aaa-lock\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.660011 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529b41ec-f1ee-432c-ac41-6957e1809aaa-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.676206 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cr9q\" (UniqueName: \"kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-kube-api-access-6cr9q\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.687652 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.099414 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-dv686"] Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.100768 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.102409 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.102773 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.105920 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.113629 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-dv686"] Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.164092 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-etc-swift\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.164145 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-scripts\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.164206 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-dispersionconf\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.164358 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chgfq\" (UniqueName: \"kubernetes.io/projected/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-kube-api-access-chgfq\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.164447 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-ring-data-devices\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.164499 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-combined-ca-bundle\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.164573 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-swiftconf\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.164661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:23 crc kubenswrapper[4790]: E0313 20:46:23.164789 4790 projected.go:288] 
Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 20:46:23 crc kubenswrapper[4790]: E0313 20:46:23.164815 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 20:46:23 crc kubenswrapper[4790]: E0313 20:46:23.164885 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift podName:529b41ec-f1ee-432c-ac41-6957e1809aaa nodeName:}" failed. No retries permitted until 2026-03-13 20:46:24.164861564 +0000 UTC m=+1115.185977455 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift") pod "swift-storage-0" (UID: "529b41ec-f1ee-432c-ac41-6957e1809aaa") : configmap "swift-ring-files" not found Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.265849 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-dispersionconf\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.266042 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chgfq\" (UniqueName: \"kubernetes.io/projected/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-kube-api-access-chgfq\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.266142 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-ring-data-devices\") pod 
\"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.267877 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-ring-data-devices\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.268209 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-combined-ca-bundle\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.268266 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-swiftconf\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.268526 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-etc-swift\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.268562 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-scripts\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " 
pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.269100 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-etc-swift\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.269222 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-scripts\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.269901 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-dispersionconf\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.270972 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-combined-ca-bundle\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.271030 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-swiftconf\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.286538 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-chgfq\" (UniqueName: \"kubernetes.io/projected/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-kube-api-access-chgfq\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.416681 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.575933 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gv56q" event={"ID":"d798b6d8-8c2b-4827-81d3-09177054591f","Type":"ContainerStarted","Data":"e8757b5e6f39b607b4f89f7c3ecb73428b1e5ac3dca1607fb2f473649fb57fcb"} Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.576319 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.604462 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-gv56q" podStartSLOduration=2.604441642 podStartE2EDuration="2.604441642s" podCreationTimestamp="2026-03-13 20:46:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:46:23.604037391 +0000 UTC m=+1114.625153292" watchObservedRunningTime="2026-03-13 20:46:23.604441642 +0000 UTC m=+1114.625557533" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.699104 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.762398 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.877278 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/swift-ring-rebalance-dv686"] Mar 13 20:46:24 crc kubenswrapper[4790]: I0313 20:46:24.181942 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:24 crc kubenswrapper[4790]: E0313 20:46:24.182103 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 20:46:24 crc kubenswrapper[4790]: E0313 20:46:24.182124 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 20:46:24 crc kubenswrapper[4790]: E0313 20:46:24.182185 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift podName:529b41ec-f1ee-432c-ac41-6957e1809aaa nodeName:}" failed. No retries permitted until 2026-03-13 20:46:26.182165528 +0000 UTC m=+1117.203281419 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift") pod "swift-storage-0" (UID: "529b41ec-f1ee-432c-ac41-6957e1809aaa") : configmap "swift-ring-files" not found Mar 13 20:46:24 crc kubenswrapper[4790]: I0313 20:46:24.585172 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dv686" event={"ID":"b4ea3695-dddc-48fe-bdb6-eb0450c697c4","Type":"ContainerStarted","Data":"e3681864143fdf49c4108aa2fae3bb58046cc42f144960f544964a65dc7f5591"} Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.217405 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:26 crc kubenswrapper[4790]: E0313 20:46:26.217825 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 20:46:26 crc kubenswrapper[4790]: E0313 20:46:26.217924 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 20:46:26 crc kubenswrapper[4790]: E0313 20:46:26.218002 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift podName:529b41ec-f1ee-432c-ac41-6957e1809aaa nodeName:}" failed. No retries permitted until 2026-03-13 20:46:30.217983819 +0000 UTC m=+1121.239099710 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift") pod "swift-storage-0" (UID: "529b41ec-f1ee-432c-ac41-6957e1809aaa") : configmap "swift-ring-files" not found Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.371768 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-n4fjc"] Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.372706 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-n4fjc" Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.380426 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.387934 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-n4fjc"] Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.521498 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d3dd8de-0de0-4703-a067-446d2822860d-operator-scripts\") pod \"root-account-create-update-n4fjc\" (UID: \"5d3dd8de-0de0-4703-a067-446d2822860d\") " pod="openstack/root-account-create-update-n4fjc" Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.521825 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d7l4\" (UniqueName: \"kubernetes.io/projected/5d3dd8de-0de0-4703-a067-446d2822860d-kube-api-access-8d7l4\") pod \"root-account-create-update-n4fjc\" (UID: \"5d3dd8de-0de0-4703-a067-446d2822860d\") " pod="openstack/root-account-create-update-n4fjc" Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.623012 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d7l4\" (UniqueName: 
\"kubernetes.io/projected/5d3dd8de-0de0-4703-a067-446d2822860d-kube-api-access-8d7l4\") pod \"root-account-create-update-n4fjc\" (UID: \"5d3dd8de-0de0-4703-a067-446d2822860d\") " pod="openstack/root-account-create-update-n4fjc" Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.623138 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d3dd8de-0de0-4703-a067-446d2822860d-operator-scripts\") pod \"root-account-create-update-n4fjc\" (UID: \"5d3dd8de-0de0-4703-a067-446d2822860d\") " pod="openstack/root-account-create-update-n4fjc" Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.623917 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d3dd8de-0de0-4703-a067-446d2822860d-operator-scripts\") pod \"root-account-create-update-n4fjc\" (UID: \"5d3dd8de-0de0-4703-a067-446d2822860d\") " pod="openstack/root-account-create-update-n4fjc" Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.642463 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d7l4\" (UniqueName: \"kubernetes.io/projected/5d3dd8de-0de0-4703-a067-446d2822860d-kube-api-access-8d7l4\") pod \"root-account-create-update-n4fjc\" (UID: \"5d3dd8de-0de0-4703-a067-446d2822860d\") " pod="openstack/root-account-create-update-n4fjc" Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.705123 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-n4fjc" Mar 13 20:46:27 crc kubenswrapper[4790]: I0313 20:46:27.977969 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-n4fjc"] Mar 13 20:46:27 crc kubenswrapper[4790]: W0313 20:46:27.983653 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d3dd8de_0de0_4703_a067_446d2822860d.slice/crio-489d3c25ae47d4be64662e82f5e8ce80011fc5028c50363476e2c336894ee85c WatchSource:0}: Error finding container 489d3c25ae47d4be64662e82f5e8ce80011fc5028c50363476e2c336894ee85c: Status 404 returned error can't find the container with id 489d3c25ae47d4be64662e82f5e8ce80011fc5028c50363476e2c336894ee85c Mar 13 20:46:28 crc kubenswrapper[4790]: I0313 20:46:28.615673 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dv686" event={"ID":"b4ea3695-dddc-48fe-bdb6-eb0450c697c4","Type":"ContainerStarted","Data":"0b6241fd3bfe8fbe3b943719b842facbdec444bbd9bc9d23531d0137fa8a476f"} Mar 13 20:46:28 crc kubenswrapper[4790]: I0313 20:46:28.620142 4790 generic.go:334] "Generic (PLEG): container finished" podID="5d3dd8de-0de0-4703-a067-446d2822860d" containerID="a469cae8d28a17763807dc70d5fbc5f435ef49995e55c306927cfc053eea835d" exitCode=0 Mar 13 20:46:28 crc kubenswrapper[4790]: I0313 20:46:28.620185 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-n4fjc" event={"ID":"5d3dd8de-0de0-4703-a067-446d2822860d","Type":"ContainerDied","Data":"a469cae8d28a17763807dc70d5fbc5f435ef49995e55c306927cfc053eea835d"} Mar 13 20:46:28 crc kubenswrapper[4790]: I0313 20:46:28.620208 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-n4fjc" event={"ID":"5d3dd8de-0de0-4703-a067-446d2822860d","Type":"ContainerStarted","Data":"489d3c25ae47d4be64662e82f5e8ce80011fc5028c50363476e2c336894ee85c"} Mar 13 
20:46:28 crc kubenswrapper[4790]: I0313 20:46:28.641943 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-dv686" podStartSLOduration=1.770172504 podStartE2EDuration="5.641922588s" podCreationTimestamp="2026-03-13 20:46:23 +0000 UTC" firstStartedPulling="2026-03-13 20:46:23.882605899 +0000 UTC m=+1114.903721790" lastFinishedPulling="2026-03-13 20:46:27.754355983 +0000 UTC m=+1118.775471874" observedRunningTime="2026-03-13 20:46:28.63793745 +0000 UTC m=+1119.659053351" watchObservedRunningTime="2026-03-13 20:46:28.641922588 +0000 UTC m=+1119.663038479" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.062797 4790 scope.go:117] "RemoveContainer" containerID="a4421190e0f8f7d5d0550c9770d73abc8a710d933f4a6e67738054d90201114f" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.380932 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-qflsz"] Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.382815 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qflsz" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.390742 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-qflsz"] Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.475722 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-bc9a-account-create-update-7s4hb"] Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.477242 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bc9a-account-create-update-7s4hb" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.479330 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.484657 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bc9a-account-create-update-7s4hb"] Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.576811 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k76nn\" (UniqueName: \"kubernetes.io/projected/9bfc00cf-9a76-4b6f-a8f5-315af824814d-kube-api-access-k76nn\") pod \"glance-db-create-qflsz\" (UID: \"9bfc00cf-9a76-4b6f-a8f5-315af824814d\") " pod="openstack/glance-db-create-qflsz" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.577329 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bfc00cf-9a76-4b6f-a8f5-315af824814d-operator-scripts\") pod \"glance-db-create-qflsz\" (UID: \"9bfc00cf-9a76-4b6f-a8f5-315af824814d\") " pod="openstack/glance-db-create-qflsz" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.678829 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-operator-scripts\") pod \"glance-bc9a-account-create-update-7s4hb\" (UID: \"1b5f7e2a-401c-4a9f-9222-5037f9d1d499\") " pod="openstack/glance-bc9a-account-create-update-7s4hb" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.679505 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9zjh\" (UniqueName: \"kubernetes.io/projected/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-kube-api-access-t9zjh\") pod \"glance-bc9a-account-create-update-7s4hb\" 
(UID: \"1b5f7e2a-401c-4a9f-9222-5037f9d1d499\") " pod="openstack/glance-bc9a-account-create-update-7s4hb" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.679724 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k76nn\" (UniqueName: \"kubernetes.io/projected/9bfc00cf-9a76-4b6f-a8f5-315af824814d-kube-api-access-k76nn\") pod \"glance-db-create-qflsz\" (UID: \"9bfc00cf-9a76-4b6f-a8f5-315af824814d\") " pod="openstack/glance-db-create-qflsz" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.679984 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bfc00cf-9a76-4b6f-a8f5-315af824814d-operator-scripts\") pod \"glance-db-create-qflsz\" (UID: \"9bfc00cf-9a76-4b6f-a8f5-315af824814d\") " pod="openstack/glance-db-create-qflsz" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.680846 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bfc00cf-9a76-4b6f-a8f5-315af824814d-operator-scripts\") pod \"glance-db-create-qflsz\" (UID: \"9bfc00cf-9a76-4b6f-a8f5-315af824814d\") " pod="openstack/glance-db-create-qflsz" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.700353 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k76nn\" (UniqueName: \"kubernetes.io/projected/9bfc00cf-9a76-4b6f-a8f5-315af824814d-kube-api-access-k76nn\") pod \"glance-db-create-qflsz\" (UID: \"9bfc00cf-9a76-4b6f-a8f5-315af824814d\") " pod="openstack/glance-db-create-qflsz" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.712082 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-qflsz" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.783224 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9zjh\" (UniqueName: \"kubernetes.io/projected/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-kube-api-access-t9zjh\") pod \"glance-bc9a-account-create-update-7s4hb\" (UID: \"1b5f7e2a-401c-4a9f-9222-5037f9d1d499\") " pod="openstack/glance-bc9a-account-create-update-7s4hb" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.783365 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-operator-scripts\") pod \"glance-bc9a-account-create-update-7s4hb\" (UID: \"1b5f7e2a-401c-4a9f-9222-5037f9d1d499\") " pod="openstack/glance-bc9a-account-create-update-7s4hb" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.784102 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-operator-scripts\") pod \"glance-bc9a-account-create-update-7s4hb\" (UID: \"1b5f7e2a-401c-4a9f-9222-5037f9d1d499\") " pod="openstack/glance-bc9a-account-create-update-7s4hb" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.848494 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9zjh\" (UniqueName: \"kubernetes.io/projected/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-kube-api-access-t9zjh\") pod \"glance-bc9a-account-create-update-7s4hb\" (UID: \"1b5f7e2a-401c-4a9f-9222-5037f9d1d499\") " pod="openstack/glance-bc9a-account-create-update-7s4hb" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.996938 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-n4fjc" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.090773 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bc9a-account-create-update-7s4hb" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.193143 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d3dd8de-0de0-4703-a067-446d2822860d-operator-scripts\") pod \"5d3dd8de-0de0-4703-a067-446d2822860d\" (UID: \"5d3dd8de-0de0-4703-a067-446d2822860d\") " Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.193170 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d7l4\" (UniqueName: \"kubernetes.io/projected/5d3dd8de-0de0-4703-a067-446d2822860d-kube-api-access-8d7l4\") pod \"5d3dd8de-0de0-4703-a067-446d2822860d\" (UID: \"5d3dd8de-0de0-4703-a067-446d2822860d\") " Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.194292 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d3dd8de-0de0-4703-a067-446d2822860d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d3dd8de-0de0-4703-a067-446d2822860d" (UID: "5d3dd8de-0de0-4703-a067-446d2822860d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.200706 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d3dd8de-0de0-4703-a067-446d2822860d-kube-api-access-8d7l4" (OuterVolumeSpecName: "kube-api-access-8d7l4") pod "5d3dd8de-0de0-4703-a067-446d2822860d" (UID: "5d3dd8de-0de0-4703-a067-446d2822860d"). InnerVolumeSpecName "kube-api-access-8d7l4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.201837 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-qflsz"] Mar 13 20:46:30 crc kubenswrapper[4790]: W0313 20:46:30.208863 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bfc00cf_9a76_4b6f_a8f5_315af824814d.slice/crio-ddf21cc08c3a5bbf2369906562b5c9d06661c39fc0ab46bfc76292c3cc9a4b03 WatchSource:0}: Error finding container ddf21cc08c3a5bbf2369906562b5c9d06661c39fc0ab46bfc76292c3cc9a4b03: Status 404 returned error can't find the container with id ddf21cc08c3a5bbf2369906562b5c9d06661c39fc0ab46bfc76292c3cc9a4b03 Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.278993 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-dtps4"] Mar 13 20:46:30 crc kubenswrapper[4790]: E0313 20:46:30.279655 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d3dd8de-0de0-4703-a067-446d2822860d" containerName="mariadb-account-create-update" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.279695 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3dd8de-0de0-4703-a067-446d2822860d" containerName="mariadb-account-create-update" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.279946 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d3dd8de-0de0-4703-a067-446d2822860d" containerName="mariadb-account-create-update" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.280749 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-dtps4" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.294978 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrddw\" (UniqueName: \"kubernetes.io/projected/20a3a1cb-c500-4355-ae67-649e381b1b88-kube-api-access-rrddw\") pod \"keystone-db-create-dtps4\" (UID: \"20a3a1cb-c500-4355-ae67-649e381b1b88\") " pod="openstack/keystone-db-create-dtps4" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.295033 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20a3a1cb-c500-4355-ae67-649e381b1b88-operator-scripts\") pod \"keystone-db-create-dtps4\" (UID: \"20a3a1cb-c500-4355-ae67-649e381b1b88\") " pod="openstack/keystone-db-create-dtps4" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.295210 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.295293 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d3dd8de-0de0-4703-a067-446d2822860d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.295310 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d7l4\" (UniqueName: \"kubernetes.io/projected/5d3dd8de-0de0-4703-a067-446d2822860d-kube-api-access-8d7l4\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:30 crc kubenswrapper[4790]: E0313 20:46:30.295370 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 20:46:30 crc 
kubenswrapper[4790]: E0313 20:46:30.295416 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 20:46:30 crc kubenswrapper[4790]: E0313 20:46:30.295484 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift podName:529b41ec-f1ee-432c-ac41-6957e1809aaa nodeName:}" failed. No retries permitted until 2026-03-13 20:46:38.295449118 +0000 UTC m=+1129.316565009 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift") pod "swift-storage-0" (UID: "529b41ec-f1ee-432c-ac41-6957e1809aaa") : configmap "swift-ring-files" not found Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.298037 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dtps4"] Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.392394 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6245-account-create-update-5tjxd"] Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.395058 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6245-account-create-update-5tjxd" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.396412 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrddw\" (UniqueName: \"kubernetes.io/projected/20a3a1cb-c500-4355-ae67-649e381b1b88-kube-api-access-rrddw\") pod \"keystone-db-create-dtps4\" (UID: \"20a3a1cb-c500-4355-ae67-649e381b1b88\") " pod="openstack/keystone-db-create-dtps4" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.396475 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20a3a1cb-c500-4355-ae67-649e381b1b88-operator-scripts\") pod \"keystone-db-create-dtps4\" (UID: \"20a3a1cb-c500-4355-ae67-649e381b1b88\") " pod="openstack/keystone-db-create-dtps4" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.396539 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfgwq\" (UniqueName: \"kubernetes.io/projected/2a7e4224-0922-4f9a-af94-0a9933f27530-kube-api-access-sfgwq\") pod \"keystone-6245-account-create-update-5tjxd\" (UID: \"2a7e4224-0922-4f9a-af94-0a9933f27530\") " pod="openstack/keystone-6245-account-create-update-5tjxd" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.396579 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a7e4224-0922-4f9a-af94-0a9933f27530-operator-scripts\") pod \"keystone-6245-account-create-update-5tjxd\" (UID: \"2a7e4224-0922-4f9a-af94-0a9933f27530\") " pod="openstack/keystone-6245-account-create-update-5tjxd" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.397102 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/20a3a1cb-c500-4355-ae67-649e381b1b88-operator-scripts\") pod \"keystone-db-create-dtps4\" (UID: \"20a3a1cb-c500-4355-ae67-649e381b1b88\") " pod="openstack/keystone-db-create-dtps4" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.397626 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.400732 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6245-account-create-update-5tjxd"] Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.421132 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrddw\" (UniqueName: \"kubernetes.io/projected/20a3a1cb-c500-4355-ae67-649e381b1b88-kube-api-access-rrddw\") pod \"keystone-db-create-dtps4\" (UID: \"20a3a1cb-c500-4355-ae67-649e381b1b88\") " pod="openstack/keystone-db-create-dtps4" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.494278 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bc9a-account-create-update-7s4hb"] Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.498214 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfgwq\" (UniqueName: \"kubernetes.io/projected/2a7e4224-0922-4f9a-af94-0a9933f27530-kube-api-access-sfgwq\") pod \"keystone-6245-account-create-update-5tjxd\" (UID: \"2a7e4224-0922-4f9a-af94-0a9933f27530\") " pod="openstack/keystone-6245-account-create-update-5tjxd" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.498263 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a7e4224-0922-4f9a-af94-0a9933f27530-operator-scripts\") pod \"keystone-6245-account-create-update-5tjxd\" (UID: \"2a7e4224-0922-4f9a-af94-0a9933f27530\") " pod="openstack/keystone-6245-account-create-update-5tjxd" Mar 13 20:46:30 crc kubenswrapper[4790]: 
I0313 20:46:30.498935 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a7e4224-0922-4f9a-af94-0a9933f27530-operator-scripts\") pod \"keystone-6245-account-create-update-5tjxd\" (UID: \"2a7e4224-0922-4f9a-af94-0a9933f27530\") " pod="openstack/keystone-6245-account-create-update-5tjxd" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.514400 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfgwq\" (UniqueName: \"kubernetes.io/projected/2a7e4224-0922-4f9a-af94-0a9933f27530-kube-api-access-sfgwq\") pod \"keystone-6245-account-create-update-5tjxd\" (UID: \"2a7e4224-0922-4f9a-af94-0a9933f27530\") " pod="openstack/keystone-6245-account-create-update-5tjxd" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.579832 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-swgpr"] Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.581305 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-swgpr" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.599909 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ddwr\" (UniqueName: \"kubernetes.io/projected/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-kube-api-access-9ddwr\") pod \"placement-db-create-swgpr\" (UID: \"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0\") " pod="openstack/placement-db-create-swgpr" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.599975 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-operator-scripts\") pod \"placement-db-create-swgpr\" (UID: \"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0\") " pod="openstack/placement-db-create-swgpr" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.608460 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-76eb-account-create-update-fsrb9"] Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.609431 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-76eb-account-create-update-fsrb9" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.613806 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.616479 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-swgpr"] Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.626899 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76eb-account-create-update-fsrb9"] Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.642653 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qflsz" event={"ID":"9bfc00cf-9a76-4b6f-a8f5-315af824814d","Type":"ContainerStarted","Data":"3a871452c0d8f0bdf8b93e4dc697c5c69984d2e498b186e2955afdc399d1238f"} Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.642770 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qflsz" event={"ID":"9bfc00cf-9a76-4b6f-a8f5-315af824814d","Type":"ContainerStarted","Data":"ddf21cc08c3a5bbf2369906562b5c9d06661c39fc0ab46bfc76292c3cc9a4b03"} Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.649034 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-n4fjc" event={"ID":"5d3dd8de-0de0-4703-a067-446d2822860d","Type":"ContainerDied","Data":"489d3c25ae47d4be64662e82f5e8ce80011fc5028c50363476e2c336894ee85c"} Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.649069 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="489d3c25ae47d4be64662e82f5e8ce80011fc5028c50363476e2c336894ee85c" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.649044 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-n4fjc" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.652186 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bc9a-account-create-update-7s4hb" event={"ID":"1b5f7e2a-401c-4a9f-9222-5037f9d1d499","Type":"ContainerStarted","Data":"8d620e2ac11015f6738329101767fa1a633c874417fba59eff69e65d3e55a8a1"} Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.678448 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-qflsz" podStartSLOduration=1.678433337 podStartE2EDuration="1.678433337s" podCreationTimestamp="2026-03-13 20:46:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:46:30.661972119 +0000 UTC m=+1121.683088010" watchObservedRunningTime="2026-03-13 20:46:30.678433337 +0000 UTC m=+1121.699549228" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.681721 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-bc9a-account-create-update-7s4hb" podStartSLOduration=1.681713646 podStartE2EDuration="1.681713646s" podCreationTimestamp="2026-03-13 20:46:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:46:30.675996421 +0000 UTC m=+1121.697112302" watchObservedRunningTime="2026-03-13 20:46:30.681713646 +0000 UTC m=+1121.702829537" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.684882 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-dtps4" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.702692 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-operator-scripts\") pod \"placement-76eb-account-create-update-fsrb9\" (UID: \"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c\") " pod="openstack/placement-76eb-account-create-update-fsrb9" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.703018 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ddwr\" (UniqueName: \"kubernetes.io/projected/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-kube-api-access-9ddwr\") pod \"placement-db-create-swgpr\" (UID: \"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0\") " pod="openstack/placement-db-create-swgpr" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.703074 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtlnp\" (UniqueName: \"kubernetes.io/projected/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-kube-api-access-jtlnp\") pod \"placement-76eb-account-create-update-fsrb9\" (UID: \"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c\") " pod="openstack/placement-76eb-account-create-update-fsrb9" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.703124 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-operator-scripts\") pod \"placement-db-create-swgpr\" (UID: \"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0\") " pod="openstack/placement-db-create-swgpr" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.703936 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-operator-scripts\") pod 
\"placement-db-create-swgpr\" (UID: \"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0\") " pod="openstack/placement-db-create-swgpr" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.723119 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ddwr\" (UniqueName: \"kubernetes.io/projected/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-kube-api-access-9ddwr\") pod \"placement-db-create-swgpr\" (UID: \"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0\") " pod="openstack/placement-db-create-swgpr" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.732756 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6245-account-create-update-5tjxd" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.805129 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtlnp\" (UniqueName: \"kubernetes.io/projected/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-kube-api-access-jtlnp\") pod \"placement-76eb-account-create-update-fsrb9\" (UID: \"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c\") " pod="openstack/placement-76eb-account-create-update-fsrb9" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.805499 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-operator-scripts\") pod \"placement-76eb-account-create-update-fsrb9\" (UID: \"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c\") " pod="openstack/placement-76eb-account-create-update-fsrb9" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.806224 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-operator-scripts\") pod \"placement-76eb-account-create-update-fsrb9\" (UID: \"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c\") " pod="openstack/placement-76eb-account-create-update-fsrb9" Mar 13 20:46:30 crc 
kubenswrapper[4790]: I0313 20:46:30.828493 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtlnp\" (UniqueName: \"kubernetes.io/projected/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-kube-api-access-jtlnp\") pod \"placement-76eb-account-create-update-fsrb9\" (UID: \"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c\") " pod="openstack/placement-76eb-account-create-update-fsrb9" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.907022 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-swgpr" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.929491 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76eb-account-create-update-fsrb9" Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.115343 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dtps4"] Mar 13 20:46:31 crc kubenswrapper[4790]: W0313 20:46:31.124704 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20a3a1cb_c500_4355_ae67_649e381b1b88.slice/crio-bb5bd2129dbed8046248b1ede3e1cd1faabe029807cfd8a506d54cd72633b98d WatchSource:0}: Error finding container bb5bd2129dbed8046248b1ede3e1cd1faabe029807cfd8a506d54cd72633b98d: Status 404 returned error can't find the container with id bb5bd2129dbed8046248b1ede3e1cd1faabe029807cfd8a506d54cd72633b98d Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.240151 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6245-account-create-update-5tjxd"] Mar 13 20:46:31 crc kubenswrapper[4790]: W0313 20:46:31.247405 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a7e4224_0922_4f9a_af94_0a9933f27530.slice/crio-b8a6527df1a629bb9f718a9d6c1b688f92dbe90739009ea26c240cdbeb3c3d7c WatchSource:0}: Error 
finding container b8a6527df1a629bb9f718a9d6c1b688f92dbe90739009ea26c240cdbeb3c3d7c: Status 404 returned error can't find the container with id b8a6527df1a629bb9f718a9d6c1b688f92dbe90739009ea26c240cdbeb3c3d7c Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.372730 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-swgpr"] Mar 13 20:46:31 crc kubenswrapper[4790]: W0313 20:46:31.373708 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0a6f76f_d9d1_4ab9_ac4c_e483e55926a0.slice/crio-9a77cfdb7cfcd9f2075b2874e9f60e5294a75e734bb2cfb24f41c6b1f0d6d411 WatchSource:0}: Error finding container 9a77cfdb7cfcd9f2075b2874e9f60e5294a75e734bb2cfb24f41c6b1f0d6d411: Status 404 returned error can't find the container with id 9a77cfdb7cfcd9f2075b2874e9f60e5294a75e734bb2cfb24f41c6b1f0d6d411 Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.455424 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76eb-account-create-update-fsrb9"] Mar 13 20:46:31 crc kubenswrapper[4790]: W0313 20:46:31.487317 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b6f7fe9_fb1f_430c_80e5_0dbe98da2b9c.slice/crio-8c1082d257c00a44403ddd4a85f3ca8dab669ff140745988c7896ffc01d4e1d1 WatchSource:0}: Error finding container 8c1082d257c00a44403ddd4a85f3ca8dab669ff140745988c7896ffc01d4e1d1: Status 404 returned error can't find the container with id 8c1082d257c00a44403ddd4a85f3ca8dab669ff140745988c7896ffc01d4e1d1 Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.666070 4790 generic.go:334] "Generic (PLEG): container finished" podID="9bfc00cf-9a76-4b6f-a8f5-315af824814d" containerID="3a871452c0d8f0bdf8b93e4dc697c5c69984d2e498b186e2955afdc399d1238f" exitCode=0 Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.669167 4790 generic.go:334] "Generic (PLEG): container 
finished" podID="20a3a1cb-c500-4355-ae67-649e381b1b88" containerID="eaedee9332ceb5ac2c43fa820fcea3e6086d5dfda3317381786c3cc819576b44" exitCode=0 Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.671325 4790 generic.go:334] "Generic (PLEG): container finished" podID="2a7e4224-0922-4f9a-af94-0a9933f27530" containerID="e50d6c82675c18c36b9041dc6a13dffb21bb7a9c1cb73ee61c06ce0d61f0b9b3" exitCode=0 Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.673163 4790 generic.go:334] "Generic (PLEG): container finished" podID="1b5f7e2a-401c-4a9f-9222-5037f9d1d499" containerID="ff9f56b80e2e388086557f7fc707002adf2609bc96cff97367abf262894bf61f" exitCode=0 Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.673434 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76eb-account-create-update-fsrb9" event={"ID":"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c","Type":"ContainerStarted","Data":"2d9bc31a36f8979f03c449ef60b47d579e8e6f07093cd0f2e81bc56503b15368"} Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.673671 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76eb-account-create-update-fsrb9" event={"ID":"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c","Type":"ContainerStarted","Data":"8c1082d257c00a44403ddd4a85f3ca8dab669ff140745988c7896ffc01d4e1d1"} Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.673699 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qflsz" event={"ID":"9bfc00cf-9a76-4b6f-a8f5-315af824814d","Type":"ContainerDied","Data":"3a871452c0d8f0bdf8b93e4dc697c5c69984d2e498b186e2955afdc399d1238f"} Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.673902 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dtps4" event={"ID":"20a3a1cb-c500-4355-ae67-649e381b1b88","Type":"ContainerDied","Data":"eaedee9332ceb5ac2c43fa820fcea3e6086d5dfda3317381786c3cc819576b44"} Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.673921 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dtps4" event={"ID":"20a3a1cb-c500-4355-ae67-649e381b1b88","Type":"ContainerStarted","Data":"bb5bd2129dbed8046248b1ede3e1cd1faabe029807cfd8a506d54cd72633b98d"} Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.673935 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6245-account-create-update-5tjxd" event={"ID":"2a7e4224-0922-4f9a-af94-0a9933f27530","Type":"ContainerDied","Data":"e50d6c82675c18c36b9041dc6a13dffb21bb7a9c1cb73ee61c06ce0d61f0b9b3"} Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.673946 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6245-account-create-update-5tjxd" event={"ID":"2a7e4224-0922-4f9a-af94-0a9933f27530","Type":"ContainerStarted","Data":"b8a6527df1a629bb9f718a9d6c1b688f92dbe90739009ea26c240cdbeb3c3d7c"} Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.673958 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bc9a-account-create-update-7s4hb" event={"ID":"1b5f7e2a-401c-4a9f-9222-5037f9d1d499","Type":"ContainerDied","Data":"ff9f56b80e2e388086557f7fc707002adf2609bc96cff97367abf262894bf61f"} Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.674796 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-swgpr" event={"ID":"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0","Type":"ContainerStarted","Data":"5bf52c9a0edc80ae6550c060c79e70ad8f311cf1880d5319a92eed662b3ae498"} Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.674828 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-swgpr" event={"ID":"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0","Type":"ContainerStarted","Data":"9a77cfdb7cfcd9f2075b2874e9f60e5294a75e734bb2cfb24f41c6b1f0d6d411"} Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.686540 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.701644 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-76eb-account-create-update-fsrb9" podStartSLOduration=1.701619511 podStartE2EDuration="1.701619511s" podCreationTimestamp="2026-03-13 20:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:46:31.68430346 +0000 UTC m=+1122.705419351" watchObservedRunningTime="2026-03-13 20:46:31.701619511 +0000 UTC m=+1122.722735412" Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.708560 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-swgpr" podStartSLOduration=1.708537519 podStartE2EDuration="1.708537519s" podCreationTimestamp="2026-03-13 20:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:46:31.69570904 +0000 UTC m=+1122.716824941" watchObservedRunningTime="2026-03-13 20:46:31.708537519 +0000 UTC m=+1122.729653430" Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.823677 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7w2fv"] Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.824326 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" podUID="5980214a-6a36-4a9b-bb65-1ca2b979d0cc" containerName="dnsmasq-dns" containerID="cri-o://e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c" gracePeriod=10 Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.369322 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.437223 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-sb\") pod \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.437302 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-nb\") pod \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.437362 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-config\") pod \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.437400 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-dns-svc\") pod \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.437505 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcv5k\" (UniqueName: \"kubernetes.io/projected/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-kube-api-access-hcv5k\") pod \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.456561 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-kube-api-access-hcv5k" (OuterVolumeSpecName: "kube-api-access-hcv5k") pod "5980214a-6a36-4a9b-bb65-1ca2b979d0cc" (UID: "5980214a-6a36-4a9b-bb65-1ca2b979d0cc"). InnerVolumeSpecName "kube-api-access-hcv5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.490343 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5980214a-6a36-4a9b-bb65-1ca2b979d0cc" (UID: "5980214a-6a36-4a9b-bb65-1ca2b979d0cc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.505415 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5980214a-6a36-4a9b-bb65-1ca2b979d0cc" (UID: "5980214a-6a36-4a9b-bb65-1ca2b979d0cc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.511828 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-config" (OuterVolumeSpecName: "config") pod "5980214a-6a36-4a9b-bb65-1ca2b979d0cc" (UID: "5980214a-6a36-4a9b-bb65-1ca2b979d0cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.525493 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5980214a-6a36-4a9b-bb65-1ca2b979d0cc" (UID: "5980214a-6a36-4a9b-bb65-1ca2b979d0cc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.539513 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.539551 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.539567 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.539577 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.539589 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcv5k\" (UniqueName: \"kubernetes.io/projected/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-kube-api-access-hcv5k\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.573372 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-n4fjc"] Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.580190 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-n4fjc"] Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.684735 4790 generic.go:334] "Generic (PLEG): container finished" podID="5980214a-6a36-4a9b-bb65-1ca2b979d0cc" containerID="e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c" exitCode=0 Mar 13 20:46:32 crc 
kubenswrapper[4790]: I0313 20:46:32.684833 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" event={"ID":"5980214a-6a36-4a9b-bb65-1ca2b979d0cc","Type":"ContainerDied","Data":"e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c"} Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.685050 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" event={"ID":"5980214a-6a36-4a9b-bb65-1ca2b979d0cc","Type":"ContainerDied","Data":"6e492e6d818823347461bc8ffdd10cbcadc5631db367ed321826eafcc8fcf49a"} Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.684850 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.685072 4790 scope.go:117] "RemoveContainer" containerID="e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.686727 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0" containerID="5bf52c9a0edc80ae6550c060c79e70ad8f311cf1880d5319a92eed662b3ae498" exitCode=0 Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.686817 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-swgpr" event={"ID":"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0","Type":"ContainerDied","Data":"5bf52c9a0edc80ae6550c060c79e70ad8f311cf1880d5319a92eed662b3ae498"} Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.689218 4790 generic.go:334] "Generic (PLEG): container finished" podID="0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c" containerID="2d9bc31a36f8979f03c449ef60b47d579e8e6f07093cd0f2e81bc56503b15368" exitCode=0 Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.689479 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76eb-account-create-update-fsrb9" 
event={"ID":"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c","Type":"ContainerDied","Data":"2d9bc31a36f8979f03c449ef60b47d579e8e6f07093cd0f2e81bc56503b15368"} Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.715141 4790 scope.go:117] "RemoveContainer" containerID="d7c869a2f8b8b93e2f3889d9c52b9758cafc80caf9c5547281b3ea804ea29dd6" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.745288 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7w2fv"] Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.750457 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7w2fv"] Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.785323 4790 scope.go:117] "RemoveContainer" containerID="e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c" Mar 13 20:46:32 crc kubenswrapper[4790]: E0313 20:46:32.790155 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c\": container with ID starting with e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c not found: ID does not exist" containerID="e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.790193 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c"} err="failed to get container status \"e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c\": rpc error: code = NotFound desc = could not find container \"e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c\": container with ID starting with e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c not found: ID does not exist" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.790215 4790 scope.go:117] "RemoveContainer" 
containerID="d7c869a2f8b8b93e2f3889d9c52b9758cafc80caf9c5547281b3ea804ea29dd6" Mar 13 20:46:32 crc kubenswrapper[4790]: E0313 20:46:32.792427 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7c869a2f8b8b93e2f3889d9c52b9758cafc80caf9c5547281b3ea804ea29dd6\": container with ID starting with d7c869a2f8b8b93e2f3889d9c52b9758cafc80caf9c5547281b3ea804ea29dd6 not found: ID does not exist" containerID="d7c869a2f8b8b93e2f3889d9c52b9758cafc80caf9c5547281b3ea804ea29dd6" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.792475 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c869a2f8b8b93e2f3889d9c52b9758cafc80caf9c5547281b3ea804ea29dd6"} err="failed to get container status \"d7c869a2f8b8b93e2f3889d9c52b9758cafc80caf9c5547281b3ea804ea29dd6\": rpc error: code = NotFound desc = could not find container \"d7c869a2f8b8b93e2f3889d9c52b9758cafc80caf9c5547281b3ea804ea29dd6\": container with ID starting with d7c869a2f8b8b93e2f3889d9c52b9758cafc80caf9c5547281b3ea804ea29dd6 not found: ID does not exist" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.023412 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bc9a-account-create-update-7s4hb" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.157844 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-operator-scripts\") pod \"1b5f7e2a-401c-4a9f-9222-5037f9d1d499\" (UID: \"1b5f7e2a-401c-4a9f-9222-5037f9d1d499\") " Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.157937 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9zjh\" (UniqueName: \"kubernetes.io/projected/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-kube-api-access-t9zjh\") pod \"1b5f7e2a-401c-4a9f-9222-5037f9d1d499\" (UID: \"1b5f7e2a-401c-4a9f-9222-5037f9d1d499\") " Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.159058 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1b5f7e2a-401c-4a9f-9222-5037f9d1d499" (UID: "1b5f7e2a-401c-4a9f-9222-5037f9d1d499"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.167625 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-kube-api-access-t9zjh" (OuterVolumeSpecName: "kube-api-access-t9zjh") pod "1b5f7e2a-401c-4a9f-9222-5037f9d1d499" (UID: "1b5f7e2a-401c-4a9f-9222-5037f9d1d499"). InnerVolumeSpecName "kube-api-access-t9zjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.237273 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6245-account-create-update-5tjxd" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.244142 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dtps4" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.260702 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.260741 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9zjh\" (UniqueName: \"kubernetes.io/projected/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-kube-api-access-t9zjh\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.287305 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qflsz" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.361238 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfgwq\" (UniqueName: \"kubernetes.io/projected/2a7e4224-0922-4f9a-af94-0a9933f27530-kube-api-access-sfgwq\") pod \"2a7e4224-0922-4f9a-af94-0a9933f27530\" (UID: \"2a7e4224-0922-4f9a-af94-0a9933f27530\") " Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.361431 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrddw\" (UniqueName: \"kubernetes.io/projected/20a3a1cb-c500-4355-ae67-649e381b1b88-kube-api-access-rrddw\") pod \"20a3a1cb-c500-4355-ae67-649e381b1b88\" (UID: \"20a3a1cb-c500-4355-ae67-649e381b1b88\") " Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.361461 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2a7e4224-0922-4f9a-af94-0a9933f27530-operator-scripts\") pod \"2a7e4224-0922-4f9a-af94-0a9933f27530\" (UID: \"2a7e4224-0922-4f9a-af94-0a9933f27530\") " Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.361513 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20a3a1cb-c500-4355-ae67-649e381b1b88-operator-scripts\") pod \"20a3a1cb-c500-4355-ae67-649e381b1b88\" (UID: \"20a3a1cb-c500-4355-ae67-649e381b1b88\") " Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.362142 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a7e4224-0922-4f9a-af94-0a9933f27530-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a7e4224-0922-4f9a-af94-0a9933f27530" (UID: "2a7e4224-0922-4f9a-af94-0a9933f27530"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.362197 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20a3a1cb-c500-4355-ae67-649e381b1b88-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "20a3a1cb-c500-4355-ae67-649e381b1b88" (UID: "20a3a1cb-c500-4355-ae67-649e381b1b88"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.364639 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20a3a1cb-c500-4355-ae67-649e381b1b88-kube-api-access-rrddw" (OuterVolumeSpecName: "kube-api-access-rrddw") pod "20a3a1cb-c500-4355-ae67-649e381b1b88" (UID: "20a3a1cb-c500-4355-ae67-649e381b1b88"). InnerVolumeSpecName "kube-api-access-rrddw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.364770 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a7e4224-0922-4f9a-af94-0a9933f27530-kube-api-access-sfgwq" (OuterVolumeSpecName: "kube-api-access-sfgwq") pod "2a7e4224-0922-4f9a-af94-0a9933f27530" (UID: "2a7e4224-0922-4f9a-af94-0a9933f27530"). InnerVolumeSpecName "kube-api-access-sfgwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.462332 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bfc00cf-9a76-4b6f-a8f5-315af824814d-operator-scripts\") pod \"9bfc00cf-9a76-4b6f-a8f5-315af824814d\" (UID: \"9bfc00cf-9a76-4b6f-a8f5-315af824814d\") " Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.462620 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k76nn\" (UniqueName: \"kubernetes.io/projected/9bfc00cf-9a76-4b6f-a8f5-315af824814d-kube-api-access-k76nn\") pod \"9bfc00cf-9a76-4b6f-a8f5-315af824814d\" (UID: \"9bfc00cf-9a76-4b6f-a8f5-315af824814d\") " Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.462845 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bfc00cf-9a76-4b6f-a8f5-315af824814d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9bfc00cf-9a76-4b6f-a8f5-315af824814d" (UID: "9bfc00cf-9a76-4b6f-a8f5-315af824814d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.463025 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bfc00cf-9a76-4b6f-a8f5-315af824814d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.463042 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrddw\" (UniqueName: \"kubernetes.io/projected/20a3a1cb-c500-4355-ae67-649e381b1b88-kube-api-access-rrddw\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.463055 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a7e4224-0922-4f9a-af94-0a9933f27530-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.463068 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20a3a1cb-c500-4355-ae67-649e381b1b88-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.463080 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfgwq\" (UniqueName: \"kubernetes.io/projected/2a7e4224-0922-4f9a-af94-0a9933f27530-kube-api-access-sfgwq\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.465534 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bfc00cf-9a76-4b6f-a8f5-315af824814d-kube-api-access-k76nn" (OuterVolumeSpecName: "kube-api-access-k76nn") pod "9bfc00cf-9a76-4b6f-a8f5-315af824814d" (UID: "9bfc00cf-9a76-4b6f-a8f5-315af824814d"). InnerVolumeSpecName "kube-api-access-k76nn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.564624 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k76nn\" (UniqueName: \"kubernetes.io/projected/9bfc00cf-9a76-4b6f-a8f5-315af824814d-kube-api-access-k76nn\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.671955 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5980214a-6a36-4a9b-bb65-1ca2b979d0cc" path="/var/lib/kubelet/pods/5980214a-6a36-4a9b-bb65-1ca2b979d0cc/volumes" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.673726 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d3dd8de-0de0-4703-a067-446d2822860d" path="/var/lib/kubelet/pods/5d3dd8de-0de0-4703-a067-446d2822860d/volumes" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.710746 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bc9a-account-create-update-7s4hb" event={"ID":"1b5f7e2a-401c-4a9f-9222-5037f9d1d499","Type":"ContainerDied","Data":"8d620e2ac11015f6738329101767fa1a633c874417fba59eff69e65d3e55a8a1"} Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.710787 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d620e2ac11015f6738329101767fa1a633c874417fba59eff69e65d3e55a8a1" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.710845 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bc9a-account-create-update-7s4hb" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.714082 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qflsz" event={"ID":"9bfc00cf-9a76-4b6f-a8f5-315af824814d","Type":"ContainerDied","Data":"ddf21cc08c3a5bbf2369906562b5c9d06661c39fc0ab46bfc76292c3cc9a4b03"} Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.714123 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddf21cc08c3a5bbf2369906562b5c9d06661c39fc0ab46bfc76292c3cc9a4b03" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.714171 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qflsz" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.716715 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dtps4" event={"ID":"20a3a1cb-c500-4355-ae67-649e381b1b88","Type":"ContainerDied","Data":"bb5bd2129dbed8046248b1ede3e1cd1faabe029807cfd8a506d54cd72633b98d"} Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.716735 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-dtps4" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.716743 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb5bd2129dbed8046248b1ede3e1cd1faabe029807cfd8a506d54cd72633b98d" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.720688 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6245-account-create-update-5tjxd" event={"ID":"2a7e4224-0922-4f9a-af94-0a9933f27530","Type":"ContainerDied","Data":"b8a6527df1a629bb9f718a9d6c1b688f92dbe90739009ea26c240cdbeb3c3d7c"} Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.720723 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8a6527df1a629bb9f718a9d6c1b688f92dbe90739009ea26c240cdbeb3c3d7c" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.720818 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6245-account-create-update-5tjxd" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.128457 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-swgpr" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.134045 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-76eb-account-create-update-fsrb9" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.277057 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ddwr\" (UniqueName: \"kubernetes.io/projected/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-kube-api-access-9ddwr\") pod \"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0\" (UID: \"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0\") " Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.277551 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-operator-scripts\") pod \"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c\" (UID: \"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c\") " Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.277609 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtlnp\" (UniqueName: \"kubernetes.io/projected/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-kube-api-access-jtlnp\") pod \"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c\" (UID: \"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c\") " Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.277714 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-operator-scripts\") pod \"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0\" (UID: \"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0\") " Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.278167 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c" (UID: "0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.278512 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0" (UID: "a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.284524 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-kube-api-access-jtlnp" (OuterVolumeSpecName: "kube-api-access-jtlnp") pod "0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c" (UID: "0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c"). InnerVolumeSpecName "kube-api-access-jtlnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.287076 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-kube-api-access-9ddwr" (OuterVolumeSpecName: "kube-api-access-9ddwr") pod "a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0" (UID: "a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0"). InnerVolumeSpecName "kube-api-access-9ddwr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.379498 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ddwr\" (UniqueName: \"kubernetes.io/projected/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-kube-api-access-9ddwr\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.379542 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.379554 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtlnp\" (UniqueName: \"kubernetes.io/projected/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-kube-api-access-jtlnp\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.379567 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607109 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-pshzp"] Mar 13 20:46:34 crc kubenswrapper[4790]: E0313 20:46:34.607477 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a3a1cb-c500-4355-ae67-649e381b1b88" containerName="mariadb-database-create" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607497 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a3a1cb-c500-4355-ae67-649e381b1b88" containerName="mariadb-database-create" Mar 13 20:46:34 crc kubenswrapper[4790]: E0313 20:46:34.607507 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5980214a-6a36-4a9b-bb65-1ca2b979d0cc" containerName="dnsmasq-dns" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607515 4790 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5980214a-6a36-4a9b-bb65-1ca2b979d0cc" containerName="dnsmasq-dns" Mar 13 20:46:34 crc kubenswrapper[4790]: E0313 20:46:34.607525 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5f7e2a-401c-4a9f-9222-5037f9d1d499" containerName="mariadb-account-create-update" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607534 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5f7e2a-401c-4a9f-9222-5037f9d1d499" containerName="mariadb-account-create-update" Mar 13 20:46:34 crc kubenswrapper[4790]: E0313 20:46:34.607549 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c" containerName="mariadb-account-create-update" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607558 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c" containerName="mariadb-account-create-update" Mar 13 20:46:34 crc kubenswrapper[4790]: E0313 20:46:34.607571 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7e4224-0922-4f9a-af94-0a9933f27530" containerName="mariadb-account-create-update" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607578 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7e4224-0922-4f9a-af94-0a9933f27530" containerName="mariadb-account-create-update" Mar 13 20:46:34 crc kubenswrapper[4790]: E0313 20:46:34.607593 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5980214a-6a36-4a9b-bb65-1ca2b979d0cc" containerName="init" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607600 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5980214a-6a36-4a9b-bb65-1ca2b979d0cc" containerName="init" Mar 13 20:46:34 crc kubenswrapper[4790]: E0313 20:46:34.607610 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0" containerName="mariadb-database-create" Mar 13 20:46:34 
crc kubenswrapper[4790]: I0313 20:46:34.607616 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0" containerName="mariadb-database-create" Mar 13 20:46:34 crc kubenswrapper[4790]: E0313 20:46:34.607628 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bfc00cf-9a76-4b6f-a8f5-315af824814d" containerName="mariadb-database-create" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607634 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bfc00cf-9a76-4b6f-a8f5-315af824814d" containerName="mariadb-database-create" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607792 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5980214a-6a36-4a9b-bb65-1ca2b979d0cc" containerName="dnsmasq-dns" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607805 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bfc00cf-9a76-4b6f-a8f5-315af824814d" containerName="mariadb-database-create" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607818 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="20a3a1cb-c500-4355-ae67-649e381b1b88" containerName="mariadb-database-create" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607828 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a7e4224-0922-4f9a-af94-0a9933f27530" containerName="mariadb-account-create-update" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607838 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b5f7e2a-401c-4a9f-9222-5037f9d1d499" containerName="mariadb-account-create-update" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607849 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0" containerName="mariadb-database-create" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607860 4790 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c" containerName="mariadb-account-create-update" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.608337 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.610339 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.610787 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dwzcz" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.621684 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-pshzp"] Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.728671 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-swgpr" event={"ID":"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0","Type":"ContainerDied","Data":"9a77cfdb7cfcd9f2075b2874e9f60e5294a75e734bb2cfb24f41c6b1f0d6d411"} Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.728708 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a77cfdb7cfcd9f2075b2874e9f60e5294a75e734bb2cfb24f41c6b1f0d6d411" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.729069 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-swgpr" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.730145 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76eb-account-create-update-fsrb9" event={"ID":"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c","Type":"ContainerDied","Data":"8c1082d257c00a44403ddd4a85f3ca8dab669ff140745988c7896ffc01d4e1d1"} Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.730179 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c1082d257c00a44403ddd4a85f3ca8dab669ff140745988c7896ffc01d4e1d1" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.730204 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76eb-account-create-update-fsrb9" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.785135 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-config-data\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.785198 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-db-sync-config-data\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.785485 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7mv9\" (UniqueName: \"kubernetes.io/projected/a93720f0-c882-49d8-bd56-7d77237da6e7-kube-api-access-f7mv9\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " 
pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.785833 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-combined-ca-bundle\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.887299 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-combined-ca-bundle\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.887368 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-config-data\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.887420 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-db-sync-config-data\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.887476 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7mv9\" (UniqueName: \"kubernetes.io/projected/a93720f0-c882-49d8-bd56-7d77237da6e7-kube-api-access-f7mv9\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 
20:46:34.892234 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-db-sync-config-data\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.892640 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-config-data\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.896267 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-combined-ca-bundle\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.907009 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7mv9\" (UniqueName: \"kubernetes.io/projected/a93720f0-c882-49d8-bd56-7d77237da6e7-kube-api-access-f7mv9\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.984885 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:35 crc kubenswrapper[4790]: W0313 20:46:35.510826 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda93720f0_c882_49d8_bd56_7d77237da6e7.slice/crio-e64d851bd686920edd764cd24360ad64b0e4d7ace08c6bc76c9a4b613130fbe8 WatchSource:0}: Error finding container e64d851bd686920edd764cd24360ad64b0e4d7ace08c6bc76c9a4b613130fbe8: Status 404 returned error can't find the container with id e64d851bd686920edd764cd24360ad64b0e4d7ace08c6bc76c9a4b613130fbe8 Mar 13 20:46:35 crc kubenswrapper[4790]: I0313 20:46:35.510926 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-pshzp"] Mar 13 20:46:35 crc kubenswrapper[4790]: I0313 20:46:35.754027 4790 generic.go:334] "Generic (PLEG): container finished" podID="b4ea3695-dddc-48fe-bdb6-eb0450c697c4" containerID="0b6241fd3bfe8fbe3b943719b842facbdec444bbd9bc9d23531d0137fa8a476f" exitCode=0 Mar 13 20:46:35 crc kubenswrapper[4790]: I0313 20:46:35.754124 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dv686" event={"ID":"b4ea3695-dddc-48fe-bdb6-eb0450c697c4","Type":"ContainerDied","Data":"0b6241fd3bfe8fbe3b943719b842facbdec444bbd9bc9d23531d0137fa8a476f"} Mar 13 20:46:35 crc kubenswrapper[4790]: I0313 20:46:35.757806 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pshzp" event={"ID":"a93720f0-c882-49d8-bd56-7d77237da6e7","Type":"ContainerStarted","Data":"e64d851bd686920edd764cd24360ad64b0e4d7ace08c6bc76c9a4b613130fbe8"} Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.101823 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.225104 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-scripts\") pod \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.225172 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-ring-data-devices\") pod \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.225741 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-swiftconf\") pod \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.225797 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chgfq\" (UniqueName: \"kubernetes.io/projected/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-kube-api-access-chgfq\") pod \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.226163 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b4ea3695-dddc-48fe-bdb6-eb0450c697c4" (UID: "b4ea3695-dddc-48fe-bdb6-eb0450c697c4"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.226226 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-combined-ca-bundle\") pod \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.226299 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-dispersionconf\") pod \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.226346 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-etc-swift\") pod \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.227356 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b4ea3695-dddc-48fe-bdb6-eb0450c697c4" (UID: "b4ea3695-dddc-48fe-bdb6-eb0450c697c4"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.227601 4790 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.227774 4790 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.230799 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-kube-api-access-chgfq" (OuterVolumeSpecName: "kube-api-access-chgfq") pod "b4ea3695-dddc-48fe-bdb6-eb0450c697c4" (UID: "b4ea3695-dddc-48fe-bdb6-eb0450c697c4"). InnerVolumeSpecName "kube-api-access-chgfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.245498 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-scripts" (OuterVolumeSpecName: "scripts") pod "b4ea3695-dddc-48fe-bdb6-eb0450c697c4" (UID: "b4ea3695-dddc-48fe-bdb6-eb0450c697c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.246090 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b4ea3695-dddc-48fe-bdb6-eb0450c697c4" (UID: "b4ea3695-dddc-48fe-bdb6-eb0450c697c4"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.250361 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b4ea3695-dddc-48fe-bdb6-eb0450c697c4" (UID: "b4ea3695-dddc-48fe-bdb6-eb0450c697c4"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.252459 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4ea3695-dddc-48fe-bdb6-eb0450c697c4" (UID: "b4ea3695-dddc-48fe-bdb6-eb0450c697c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.329790 4790 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.329828 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chgfq\" (UniqueName: \"kubernetes.io/projected/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-kube-api-access-chgfq\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.329842 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.329854 4790 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-dispersionconf\") on node \"crc\" DevicePath \"\"" 
Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.329865 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.586132 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-fjjbp"] Mar 13 20:46:37 crc kubenswrapper[4790]: E0313 20:46:37.586737 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ea3695-dddc-48fe-bdb6-eb0450c697c4" containerName="swift-ring-rebalance" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.586781 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ea3695-dddc-48fe-bdb6-eb0450c697c4" containerName="swift-ring-rebalance" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.587027 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4ea3695-dddc-48fe-bdb6-eb0450c697c4" containerName="swift-ring-rebalance" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.587607 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-fjjbp" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.590996 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.593197 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fjjbp"] Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.736017 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfa975ed-d42b-43be-91a1-4a2288005883-operator-scripts\") pod \"root-account-create-update-fjjbp\" (UID: \"cfa975ed-d42b-43be-91a1-4a2288005883\") " pod="openstack/root-account-create-update-fjjbp" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.736402 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhbt6\" (UniqueName: \"kubernetes.io/projected/cfa975ed-d42b-43be-91a1-4a2288005883-kube-api-access-jhbt6\") pod \"root-account-create-update-fjjbp\" (UID: \"cfa975ed-d42b-43be-91a1-4a2288005883\") " pod="openstack/root-account-create-update-fjjbp" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.774913 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dv686" event={"ID":"b4ea3695-dddc-48fe-bdb6-eb0450c697c4","Type":"ContainerDied","Data":"e3681864143fdf49c4108aa2fae3bb58046cc42f144960f544964a65dc7f5591"} Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.774982 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.774988 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3681864143fdf49c4108aa2fae3bb58046cc42f144960f544964a65dc7f5591" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.837591 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfa975ed-d42b-43be-91a1-4a2288005883-operator-scripts\") pod \"root-account-create-update-fjjbp\" (UID: \"cfa975ed-d42b-43be-91a1-4a2288005883\") " pod="openstack/root-account-create-update-fjjbp" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.837719 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhbt6\" (UniqueName: \"kubernetes.io/projected/cfa975ed-d42b-43be-91a1-4a2288005883-kube-api-access-jhbt6\") pod \"root-account-create-update-fjjbp\" (UID: \"cfa975ed-d42b-43be-91a1-4a2288005883\") " pod="openstack/root-account-create-update-fjjbp" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.838477 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfa975ed-d42b-43be-91a1-4a2288005883-operator-scripts\") pod \"root-account-create-update-fjjbp\" (UID: \"cfa975ed-d42b-43be-91a1-4a2288005883\") " pod="openstack/root-account-create-update-fjjbp" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.855051 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhbt6\" (UniqueName: \"kubernetes.io/projected/cfa975ed-d42b-43be-91a1-4a2288005883-kube-api-access-jhbt6\") pod \"root-account-create-update-fjjbp\" (UID: \"cfa975ed-d42b-43be-91a1-4a2288005883\") " pod="openstack/root-account-create-update-fjjbp" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.909168 4790 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/root-account-create-update-fjjbp" Mar 13 20:46:38 crc kubenswrapper[4790]: I0313 20:46:38.114611 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 13 20:46:38 crc kubenswrapper[4790]: I0313 20:46:38.345560 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:38 crc kubenswrapper[4790]: I0313 20:46:38.352249 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:38 crc kubenswrapper[4790]: I0313 20:46:38.381514 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fjjbp"] Mar 13 20:46:38 crc kubenswrapper[4790]: W0313 20:46:38.384574 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfa975ed_d42b_43be_91a1_4a2288005883.slice/crio-3bf57dd5b57dc7d65b3f15779be0b8a3e858bcc1e8b44b3a939aa707c5b1c099 WatchSource:0}: Error finding container 3bf57dd5b57dc7d65b3f15779be0b8a3e858bcc1e8b44b3a939aa707c5b1c099: Status 404 returned error can't find the container with id 3bf57dd5b57dc7d65b3f15779be0b8a3e858bcc1e8b44b3a939aa707c5b1c099 Mar 13 20:46:38 crc kubenswrapper[4790]: I0313 20:46:38.405462 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 13 20:46:38 crc kubenswrapper[4790]: I0313 20:46:38.784157 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fjjbp" event={"ID":"cfa975ed-d42b-43be-91a1-4a2288005883","Type":"ContainerStarted","Data":"3bf57dd5b57dc7d65b3f15779be0b8a3e858bcc1e8b44b3a939aa707c5b1c099"} Mar 13 20:46:38 crc kubenswrapper[4790]: I0313 20:46:38.957530 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 13 20:46:38 crc kubenswrapper[4790]: W0313 20:46:38.965113 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod529b41ec_f1ee_432c_ac41_6957e1809aaa.slice/crio-9415bf0f3992a52cf1a79ff736ac04bd686941b9e09ae8cebf4afd6163b8f021 WatchSource:0}: Error finding container 9415bf0f3992a52cf1a79ff736ac04bd686941b9e09ae8cebf4afd6163b8f021: Status 404 returned error can't find the container with id 9415bf0f3992a52cf1a79ff736ac04bd686941b9e09ae8cebf4afd6163b8f021 Mar 13 20:46:39 crc kubenswrapper[4790]: I0313 20:46:39.793241 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"9415bf0f3992a52cf1a79ff736ac04bd686941b9e09ae8cebf4afd6163b8f021"} Mar 13 20:46:40 crc kubenswrapper[4790]: I0313 20:46:40.803966 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fjjbp" event={"ID":"cfa975ed-d42b-43be-91a1-4a2288005883","Type":"ContainerStarted","Data":"caad9d6f6144a7c4b4a17b2bfe51bfa98c2031dcffb22ecbae67c200ab59beba"} Mar 13 20:46:40 crc kubenswrapper[4790]: I0313 20:46:40.822808 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-fjjbp" podStartSLOduration=3.822785085 podStartE2EDuration="3.822785085s" podCreationTimestamp="2026-03-13 20:46:37 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:46:40.817282715 +0000 UTC m=+1131.838398606" watchObservedRunningTime="2026-03-13 20:46:40.822785085 +0000 UTC m=+1131.843900976" Mar 13 20:46:41 crc kubenswrapper[4790]: I0313 20:46:41.814344 4790 generic.go:334] "Generic (PLEG): container finished" podID="cfa975ed-d42b-43be-91a1-4a2288005883" containerID="caad9d6f6144a7c4b4a17b2bfe51bfa98c2031dcffb22ecbae67c200ab59beba" exitCode=0 Mar 13 20:46:41 crc kubenswrapper[4790]: I0313 20:46:41.814415 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fjjbp" event={"ID":"cfa975ed-d42b-43be-91a1-4a2288005883","Type":"ContainerDied","Data":"caad9d6f6144a7c4b4a17b2bfe51bfa98c2031dcffb22ecbae67c200ab59beba"} Mar 13 20:46:43 crc kubenswrapper[4790]: I0313 20:46:43.831258 4790 generic.go:334] "Generic (PLEG): container finished" podID="e50b80fb-2251-49e7-a285-1276dbaa3237" containerID="e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde" exitCode=0 Mar 13 20:46:43 crc kubenswrapper[4790]: I0313 20:46:43.831373 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e50b80fb-2251-49e7-a285-1276dbaa3237","Type":"ContainerDied","Data":"e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde"} Mar 13 20:46:43 crc kubenswrapper[4790]: I0313 20:46:43.834236 4790 generic.go:334] "Generic (PLEG): container finished" podID="c575f482-56cd-4dfc-84c6-c6bb922d56a9" containerID="a8891038882e88af0702659321fde381a785634e4a17975de8d9af4797337040" exitCode=0 Mar 13 20:46:43 crc kubenswrapper[4790]: I0313 20:46:43.834277 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c575f482-56cd-4dfc-84c6-c6bb922d56a9","Type":"ContainerDied","Data":"a8891038882e88af0702659321fde381a785634e4a17975de8d9af4797337040"} Mar 13 20:46:44 crc 
kubenswrapper[4790]: I0313 20:46:44.593285 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vspq5" podUID="c72ac557-7882-4120-b64a-4343639cc766" containerName="ovn-controller" probeResult="failure" output=< Mar 13 20:46:44 crc kubenswrapper[4790]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 13 20:46:44 crc kubenswrapper[4790]: > Mar 13 20:46:44 crc kubenswrapper[4790]: I0313 20:46:44.685134 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:46:44 crc kubenswrapper[4790]: I0313 20:46:44.686372 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:46:44 crc kubenswrapper[4790]: I0313 20:46:44.908202 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vspq5-config-bhht7"] Mar 13 20:46:44 crc kubenswrapper[4790]: I0313 20:46:44.909759 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:44 crc kubenswrapper[4790]: I0313 20:46:44.911841 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 13 20:46:44 crc kubenswrapper[4790]: I0313 20:46:44.936043 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vspq5-config-bhht7"] Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.066698 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-log-ovn\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.066779 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-scripts\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.066841 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run-ovn\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.066874 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-additional-scripts\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: 
\"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.066947 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kzsb\" (UniqueName: \"kubernetes.io/projected/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-kube-api-access-5kzsb\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.066983 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.168580 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run-ovn\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.168868 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-additional-scripts\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.168932 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run-ovn\") pod 
\"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.169035 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kzsb\" (UniqueName: \"kubernetes.io/projected/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-kube-api-access-5kzsb\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.169133 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.169255 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-log-ovn\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.169356 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-scripts\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.169367 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-log-ovn\") pod \"ovn-controller-vspq5-config-bhht7\" 
(UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.169261 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.169634 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-additional-scripts\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.171636 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-scripts\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.187621 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kzsb\" (UniqueName: \"kubernetes.io/projected/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-kube-api-access-5kzsb\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.231657 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.168120 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-fjjbp" Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.322841 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfa975ed-d42b-43be-91a1-4a2288005883-operator-scripts\") pod \"cfa975ed-d42b-43be-91a1-4a2288005883\" (UID: \"cfa975ed-d42b-43be-91a1-4a2288005883\") " Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.323030 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbt6\" (UniqueName: \"kubernetes.io/projected/cfa975ed-d42b-43be-91a1-4a2288005883-kube-api-access-jhbt6\") pod \"cfa975ed-d42b-43be-91a1-4a2288005883\" (UID: \"cfa975ed-d42b-43be-91a1-4a2288005883\") " Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.323832 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfa975ed-d42b-43be-91a1-4a2288005883-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cfa975ed-d42b-43be-91a1-4a2288005883" (UID: "cfa975ed-d42b-43be-91a1-4a2288005883"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.326960 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfa975ed-d42b-43be-91a1-4a2288005883-kube-api-access-jhbt6" (OuterVolumeSpecName: "kube-api-access-jhbt6") pod "cfa975ed-d42b-43be-91a1-4a2288005883" (UID: "cfa975ed-d42b-43be-91a1-4a2288005883"). InnerVolumeSpecName "kube-api-access-jhbt6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.425113 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfa975ed-d42b-43be-91a1-4a2288005883-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.425150 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbt6\" (UniqueName: \"kubernetes.io/projected/cfa975ed-d42b-43be-91a1-4a2288005883-kube-api-access-jhbt6\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.897487 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fjjbp" Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.897486 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fjjbp" event={"ID":"cfa975ed-d42b-43be-91a1-4a2288005883","Type":"ContainerDied","Data":"3bf57dd5b57dc7d65b3f15779be0b8a3e858bcc1e8b44b3a939aa707c5b1c099"} Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.898269 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bf57dd5b57dc7d65b3f15779be0b8a3e858bcc1e8b44b3a939aa707c5b1c099" Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.903966 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vspq5-config-bhht7"] Mar 13 20:46:48 crc kubenswrapper[4790]: W0313 20:46:48.914969 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40a4a31a_66d7_491c_bb0a_e2ec83fc928c.slice/crio-1048379e8efcb844f06a2b362f4b322af4d288c96284f1a7dec3fca8b68074eb WatchSource:0}: Error finding container 1048379e8efcb844f06a2b362f4b322af4d288c96284f1a7dec3fca8b68074eb: Status 404 returned error can't find the container with id 
1048379e8efcb844f06a2b362f4b322af4d288c96284f1a7dec3fca8b68074eb Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.917518 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"bdc973933e06ab9bcb50133f253fc70761af872f58b342af7472b9f35c74b873"} Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.917562 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"46594b4458d0348ffb2e7b5e3a1c9038df2cfbe6502cd7cc99631f58654fcca1"} Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.921405 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e50b80fb-2251-49e7-a285-1276dbaa3237","Type":"ContainerStarted","Data":"b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e"} Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.921624 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.926920 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c575f482-56cd-4dfc-84c6-c6bb922d56a9","Type":"ContainerStarted","Data":"9121b19136cbb4acb4e68cfa3615a87c401027dec9c5c50a951e3a05a6de57b4"} Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.927142 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.950242 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=54.832492007 podStartE2EDuration="1m3.950223587s" podCreationTimestamp="2026-03-13 20:45:45 +0000 UTC" firstStartedPulling="2026-03-13 20:45:58.876068302 +0000 UTC m=+1089.897184183" 
lastFinishedPulling="2026-03-13 20:46:07.993799872 +0000 UTC m=+1099.014915763" observedRunningTime="2026-03-13 20:46:48.944013978 +0000 UTC m=+1139.965129879" watchObservedRunningTime="2026-03-13 20:46:48.950223587 +0000 UTC m=+1139.971339478" Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.980818 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=54.447017916 podStartE2EDuration="1m4.980796479s" podCreationTimestamp="2026-03-13 20:45:44 +0000 UTC" firstStartedPulling="2026-03-13 20:45:58.906120459 +0000 UTC m=+1089.927236350" lastFinishedPulling="2026-03-13 20:46:09.439899022 +0000 UTC m=+1100.461014913" observedRunningTime="2026-03-13 20:46:48.971555827 +0000 UTC m=+1139.992671718" watchObservedRunningTime="2026-03-13 20:46:48.980796479 +0000 UTC m=+1140.001912370" Mar 13 20:46:49 crc kubenswrapper[4790]: I0313 20:46:49.611539 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-vspq5" Mar 13 20:46:49 crc kubenswrapper[4790]: I0313 20:46:49.936354 4790 generic.go:334] "Generic (PLEG): container finished" podID="40a4a31a-66d7-491c-bb0a-e2ec83fc928c" containerID="08d59d9ecbc8376b9de39bc3a93a8ca2a0b84d09598e5daa63ce7fe053fdaadf" exitCode=0 Mar 13 20:46:49 crc kubenswrapper[4790]: I0313 20:46:49.936720 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vspq5-config-bhht7" event={"ID":"40a4a31a-66d7-491c-bb0a-e2ec83fc928c","Type":"ContainerDied","Data":"08d59d9ecbc8376b9de39bc3a93a8ca2a0b84d09598e5daa63ce7fe053fdaadf"} Mar 13 20:46:49 crc kubenswrapper[4790]: I0313 20:46:49.936751 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vspq5-config-bhht7" event={"ID":"40a4a31a-66d7-491c-bb0a-e2ec83fc928c","Type":"ContainerStarted","Data":"1048379e8efcb844f06a2b362f4b322af4d288c96284f1a7dec3fca8b68074eb"} Mar 13 20:46:49 crc kubenswrapper[4790]: I0313 20:46:49.938674 
4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pshzp" event={"ID":"a93720f0-c882-49d8-bd56-7d77237da6e7","Type":"ContainerStarted","Data":"7db39c36784dd09efea0e74c586352f81de4ffb0a5c4d04fdfe061e937df855c"} Mar 13 20:46:49 crc kubenswrapper[4790]: I0313 20:46:49.941309 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"6365d7f281748fda6e37e1322ae1ca8277b8843c96ec9e7cceef6dfbf552801d"} Mar 13 20:46:49 crc kubenswrapper[4790]: I0313 20:46:49.941368 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"a0d6a9f8a9598434b5c6c515c3cac18ae26d99e999a72ce20259db06fffa438e"} Mar 13 20:46:49 crc kubenswrapper[4790]: I0313 20:46:49.973531 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-pshzp" podStartSLOduration=3.014532408 podStartE2EDuration="15.973513984s" podCreationTimestamp="2026-03-13 20:46:34 +0000 UTC" firstStartedPulling="2026-03-13 20:46:35.51366368 +0000 UTC m=+1126.534779571" lastFinishedPulling="2026-03-13 20:46:48.472645256 +0000 UTC m=+1139.493761147" observedRunningTime="2026-03-13 20:46:49.968334513 +0000 UTC m=+1140.989450404" watchObservedRunningTime="2026-03-13 20:46:49.973513984 +0000 UTC m=+1140.994629875" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.835804 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.881799 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-additional-scripts\") pod \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.881964 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-log-ovn\") pod \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.882000 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-scripts\") pod \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.882026 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run-ovn\") pod \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.882058 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kzsb\" (UniqueName: \"kubernetes.io/projected/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-kube-api-access-5kzsb\") pod \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.882115 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run\") pod \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.883241 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "40a4a31a-66d7-491c-bb0a-e2ec83fc928c" (UID: "40a4a31a-66d7-491c-bb0a-e2ec83fc928c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.883284 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "40a4a31a-66d7-491c-bb0a-e2ec83fc928c" (UID: "40a4a31a-66d7-491c-bb0a-e2ec83fc928c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.883317 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "40a4a31a-66d7-491c-bb0a-e2ec83fc928c" (UID: "40a4a31a-66d7-491c-bb0a-e2ec83fc928c"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.883474 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run" (OuterVolumeSpecName: "var-run") pod "40a4a31a-66d7-491c-bb0a-e2ec83fc928c" (UID: "40a4a31a-66d7-491c-bb0a-e2ec83fc928c"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.883731 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-scripts" (OuterVolumeSpecName: "scripts") pod "40a4a31a-66d7-491c-bb0a-e2ec83fc928c" (UID: "40a4a31a-66d7-491c-bb0a-e2ec83fc928c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.884347 4790 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.884468 4790 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.884541 4790 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.900165 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-kube-api-access-5kzsb" (OuterVolumeSpecName: "kube-api-access-5kzsb") pod "40a4a31a-66d7-491c-bb0a-e2ec83fc928c" (UID: "40a4a31a-66d7-491c-bb0a-e2ec83fc928c"). InnerVolumeSpecName "kube-api-access-5kzsb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.956200 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vspq5-config-bhht7" event={"ID":"40a4a31a-66d7-491c-bb0a-e2ec83fc928c","Type":"ContainerDied","Data":"1048379e8efcb844f06a2b362f4b322af4d288c96284f1a7dec3fca8b68074eb"} Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.956240 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1048379e8efcb844f06a2b362f4b322af4d288c96284f1a7dec3fca8b68074eb" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.956304 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.985950 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.985987 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kzsb\" (UniqueName: \"kubernetes.io/projected/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-kube-api-access-5kzsb\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.986001 4790 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:52 crc kubenswrapper[4790]: I0313 20:46:52.957274 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-vspq5-config-bhht7"] Mar 13 20:46:52 crc kubenswrapper[4790]: I0313 20:46:52.968272 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"017619036f9ca6f810c213f0b3de8f4ea5c090f4dc5c6ff264835b9dcad3c6bb"} Mar 13 20:46:52 crc kubenswrapper[4790]: I0313 20:46:52.968314 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"73fef00ea772029c19cf3c552184e3397b7e3431ac12bdeceb7ae46545b3ba70"} Mar 13 20:46:52 crc kubenswrapper[4790]: I0313 20:46:52.968325 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"7f67885f1d32551fd88f2f0c1a892a533f5cd15e4c6f0ce76ad73176eced75dc"} Mar 13 20:46:52 crc kubenswrapper[4790]: I0313 20:46:52.968333 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"20e79c8e864aea43f16daa3be101f4b3c472f442b10edd567b174e0ccc4d9ee4"} Mar 13 20:46:52 crc kubenswrapper[4790]: I0313 20:46:52.971783 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-vspq5-config-bhht7"] Mar 13 20:46:53 crc kubenswrapper[4790]: I0313 20:46:53.676968 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40a4a31a-66d7-491c-bb0a-e2ec83fc928c" path="/var/lib/kubelet/pods/40a4a31a-66d7-491c-bb0a-e2ec83fc928c/volumes" Mar 13 20:46:55 crc kubenswrapper[4790]: I0313 20:46:55.992350 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"3f7833024195dd324060c218c2a3f5fb900ef53a89d5be73c310f68e13708b86"} Mar 13 20:46:57 crc kubenswrapper[4790]: I0313 20:46:57.020871 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"9b444b8eca824e8f01844c07a099fab88a2db4ebaa62fe1ed779ac33cb699fe5"} Mar 13 20:46:57 crc kubenswrapper[4790]: I0313 20:46:57.021278 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"d11eadc1c5b3bb1ddb9981d59d820dadfe7f934bbdd45f8604fbabb2a3ea57bd"} Mar 13 20:46:57 crc kubenswrapper[4790]: I0313 20:46:57.021294 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"5d7b94ae5d4e238cb2860ed3a1e775bc3f7b10e31bbba0b31422281bc140b6d4"} Mar 13 20:46:57 crc kubenswrapper[4790]: I0313 20:46:57.021308 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"3d43b5ff0c1cf6471f540894b4d67fb592205c26dcf5cb8c5a0070409f1c3446"} Mar 13 20:46:57 crc kubenswrapper[4790]: I0313 20:46:57.021320 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"b2f6c7637a8cfb3deb39061dcc37aa3239822aceb779db6f1dfa5f460bdf0e54"} Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.033893 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"bc32d534140b7118dc53dd1912c614b1d4f3abe74dd799cbe78cd522d23613ae"} Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.097844 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.445998070999998 podStartE2EDuration="37.097825402s" podCreationTimestamp="2026-03-13 20:46:21 +0000 UTC" firstStartedPulling="2026-03-13 
20:46:38.967203199 +0000 UTC m=+1129.988319090" lastFinishedPulling="2026-03-13 20:46:55.61903053 +0000 UTC m=+1146.640146421" observedRunningTime="2026-03-13 20:46:58.096530356 +0000 UTC m=+1149.117646257" watchObservedRunningTime="2026-03-13 20:46:58.097825402 +0000 UTC m=+1149.118941293" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.380545 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-kgfxm"] Mar 13 20:46:58 crc kubenswrapper[4790]: E0313 20:46:58.380964 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa975ed-d42b-43be-91a1-4a2288005883" containerName="mariadb-account-create-update" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.380985 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa975ed-d42b-43be-91a1-4a2288005883" containerName="mariadb-account-create-update" Mar 13 20:46:58 crc kubenswrapper[4790]: E0313 20:46:58.381006 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40a4a31a-66d7-491c-bb0a-e2ec83fc928c" containerName="ovn-config" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.381012 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="40a4a31a-66d7-491c-bb0a-e2ec83fc928c" containerName="ovn-config" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.381173 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfa975ed-d42b-43be-91a1-4a2288005883" containerName="mariadb-account-create-update" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.381198 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="40a4a31a-66d7-491c-bb0a-e2ec83fc928c" containerName="ovn-config" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.382074 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.384617 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.396642 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-kgfxm"] Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.487842 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.487890 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnz2b\" (UniqueName: \"kubernetes.io/projected/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-kube-api-access-mnz2b\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.487910 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.487926 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-svc\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " 
pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.488161 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.488245 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-config\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.589435 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.589488 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-config\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.589572 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " 
pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.589596 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnz2b\" (UniqueName: \"kubernetes.io/projected/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-kube-api-access-mnz2b\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.589615 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.589637 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-svc\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.590716 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.590724 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-svc\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc 
kubenswrapper[4790]: I0313 20:46:58.590928 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-config\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.591335 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.591461 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.609495 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnz2b\" (UniqueName: \"kubernetes.io/projected/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-kube-api-access-mnz2b\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.712438 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:59 crc kubenswrapper[4790]: I0313 20:46:59.042878 4790 generic.go:334] "Generic (PLEG): container finished" podID="a93720f0-c882-49d8-bd56-7d77237da6e7" containerID="7db39c36784dd09efea0e74c586352f81de4ffb0a5c4d04fdfe061e937df855c" exitCode=0 Mar 13 20:46:59 crc kubenswrapper[4790]: I0313 20:46:59.042935 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pshzp" event={"ID":"a93720f0-c882-49d8-bd56-7d77237da6e7","Type":"ContainerDied","Data":"7db39c36784dd09efea0e74c586352f81de4ffb0a5c4d04fdfe061e937df855c"} Mar 13 20:46:59 crc kubenswrapper[4790]: W0313 20:46:59.206268 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37649c3b_ff5a_4ec3_a118_6a35d72bb4a2.slice/crio-f2ee2aebbd470d658512f56eb6204c2f2beaa533895f3a788189a41f33cd8f40 WatchSource:0}: Error finding container f2ee2aebbd470d658512f56eb6204c2f2beaa533895f3a788189a41f33cd8f40: Status 404 returned error can't find the container with id f2ee2aebbd470d658512f56eb6204c2f2beaa533895f3a788189a41f33cd8f40 Mar 13 20:46:59 crc kubenswrapper[4790]: I0313 20:46:59.217481 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-kgfxm"] Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.051429 4790 generic.go:334] "Generic (PLEG): container finished" podID="37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" containerID="5162dfca40a2769b55fd2264df73ec279e1c28b641cba843385959879d6e1300" exitCode=0 Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.051549 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" event={"ID":"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2","Type":"ContainerDied","Data":"5162dfca40a2769b55fd2264df73ec279e1c28b641cba843385959879d6e1300"} Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.051652 4790 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" event={"ID":"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2","Type":"ContainerStarted","Data":"f2ee2aebbd470d658512f56eb6204c2f2beaa533895f3a788189a41f33cd8f40"} Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.464965 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pshzp" Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.521010 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-db-sync-config-data\") pod \"a93720f0-c882-49d8-bd56-7d77237da6e7\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.521062 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7mv9\" (UniqueName: \"kubernetes.io/projected/a93720f0-c882-49d8-bd56-7d77237da6e7-kube-api-access-f7mv9\") pod \"a93720f0-c882-49d8-bd56-7d77237da6e7\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.521233 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-combined-ca-bundle\") pod \"a93720f0-c882-49d8-bd56-7d77237da6e7\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.521278 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-config-data\") pod \"a93720f0-c882-49d8-bd56-7d77237da6e7\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.525927 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a93720f0-c882-49d8-bd56-7d77237da6e7" (UID: "a93720f0-c882-49d8-bd56-7d77237da6e7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.529137 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a93720f0-c882-49d8-bd56-7d77237da6e7-kube-api-access-f7mv9" (OuterVolumeSpecName: "kube-api-access-f7mv9") pod "a93720f0-c882-49d8-bd56-7d77237da6e7" (UID: "a93720f0-c882-49d8-bd56-7d77237da6e7"). InnerVolumeSpecName "kube-api-access-f7mv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.544174 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a93720f0-c882-49d8-bd56-7d77237da6e7" (UID: "a93720f0-c882-49d8-bd56-7d77237da6e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.562007 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-config-data" (OuterVolumeSpecName: "config-data") pod "a93720f0-c882-49d8-bd56-7d77237da6e7" (UID: "a93720f0-c882-49d8-bd56-7d77237da6e7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.622765 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.622797 4790 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.622812 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7mv9\" (UniqueName: \"kubernetes.io/projected/a93720f0-c882-49d8-bd56-7d77237da6e7-kube-api-access-f7mv9\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.622820 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.061504 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pshzp" event={"ID":"a93720f0-c882-49d8-bd56-7d77237da6e7","Type":"ContainerDied","Data":"e64d851bd686920edd764cd24360ad64b0e4d7ace08c6bc76c9a4b613130fbe8"} Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.061613 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e64d851bd686920edd764cd24360ad64b0e4d7ace08c6bc76c9a4b613130fbe8" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.061648 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-pshzp" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.066493 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" event={"ID":"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2","Type":"ContainerStarted","Data":"650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8"} Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.066670 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.105534 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" podStartSLOduration=3.10551271 podStartE2EDuration="3.10551271s" podCreationTimestamp="2026-03-13 20:46:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:01.091336034 +0000 UTC m=+1152.112451925" watchObservedRunningTime="2026-03-13 20:47:01.10551271 +0000 UTC m=+1152.126628601" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.420589 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-kgfxm"] Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.451716 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-v8dxb"] Mar 13 20:47:01 crc kubenswrapper[4790]: E0313 20:47:01.454898 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a93720f0-c882-49d8-bd56-7d77237da6e7" containerName="glance-db-sync" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.454938 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a93720f0-c882-49d8-bd56-7d77237da6e7" containerName="glance-db-sync" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.455152 4790 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a93720f0-c882-49d8-bd56-7d77237da6e7" containerName="glance-db-sync" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.456603 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.471044 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-v8dxb"] Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.538405 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.538539 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.538752 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxcd9\" (UniqueName: \"kubernetes.io/projected/34d41874-8dfa-4e3d-9298-d027a3e3c921-kube-api-access-rxcd9\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.538839 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: 
\"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.538940 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.538972 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-config\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.640231 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.640288 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.640344 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxcd9\" (UniqueName: \"kubernetes.io/projected/34d41874-8dfa-4e3d-9298-d027a3e3c921-kube-api-access-rxcd9\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: 
\"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.640369 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.640421 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.640444 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-config\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.641158 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.641223 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" 
Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.641264 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-config\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.641731 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.641809 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.658255 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxcd9\" (UniqueName: \"kubernetes.io/projected/34d41874-8dfa-4e3d-9298-d027a3e3c921-kube-api-access-rxcd9\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.774592 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.082302 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" podUID="37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" containerName="dnsmasq-dns" containerID="cri-o://650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8" gracePeriod=10 Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.175645 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-v8dxb"] Mar 13 20:47:03 crc kubenswrapper[4790]: W0313 20:47:03.192481 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34d41874_8dfa_4e3d_9298_d027a3e3c921.slice/crio-deda6dfa1a9df2280b428b849e29fe6809f9079a777def89e5ae47fabd177aa8 WatchSource:0}: Error finding container deda6dfa1a9df2280b428b849e29fe6809f9079a777def89e5ae47fabd177aa8: Status 404 returned error can't find the container with id deda6dfa1a9df2280b428b849e29fe6809f9079a777def89e5ae47fabd177aa8 Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.624886 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.679947 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-swift-storage-0\") pod \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.680074 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnz2b\" (UniqueName: \"kubernetes.io/projected/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-kube-api-access-mnz2b\") pod \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.680129 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-sb\") pod \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.680178 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-svc\") pod \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.680204 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-config\") pod \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.680320 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-nb\") pod \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.687140 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-kube-api-access-mnz2b" (OuterVolumeSpecName: "kube-api-access-mnz2b") pod "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" (UID: "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2"). InnerVolumeSpecName "kube-api-access-mnz2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.727194 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-config" (OuterVolumeSpecName: "config") pod "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" (UID: "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.729907 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" (UID: "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.734687 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" (UID: "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.735954 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" (UID: "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.737325 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" (UID: "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.781753 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.781786 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.781797 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnz2b\" (UniqueName: \"kubernetes.io/projected/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-kube-api-access-mnz2b\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.781806 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-sb\") on node \"crc\" 
DevicePath \"\"" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.781814 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.781822 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.090039 4790 generic.go:334] "Generic (PLEG): container finished" podID="34d41874-8dfa-4e3d-9298-d027a3e3c921" containerID="03a87f5d6c3388f53ac8b07b4a8345caa059485eb6f71dad3953ac168c0ce643" exitCode=0 Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.090131 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" event={"ID":"34d41874-8dfa-4e3d-9298-d027a3e3c921","Type":"ContainerDied","Data":"03a87f5d6c3388f53ac8b07b4a8345caa059485eb6f71dad3953ac168c0ce643"} Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.092183 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" event={"ID":"34d41874-8dfa-4e3d-9298-d027a3e3c921","Type":"ContainerStarted","Data":"deda6dfa1a9df2280b428b849e29fe6809f9079a777def89e5ae47fabd177aa8"} Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.099663 4790 generic.go:334] "Generic (PLEG): container finished" podID="37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" containerID="650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8" exitCode=0 Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.099712 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" event={"ID":"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2","Type":"ContainerDied","Data":"650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8"} Mar 13 20:47:04 crc 
kubenswrapper[4790]: I0313 20:47:04.099747 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" event={"ID":"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2","Type":"ContainerDied","Data":"f2ee2aebbd470d658512f56eb6204c2f2beaa533895f3a788189a41f33cd8f40"} Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.099766 4790 scope.go:117] "RemoveContainer" containerID="650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8" Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.108856 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.151837 4790 scope.go:117] "RemoveContainer" containerID="5162dfca40a2769b55fd2264df73ec279e1c28b641cba843385959879d6e1300" Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.165410 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-kgfxm"] Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.171828 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-kgfxm"] Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.183183 4790 scope.go:117] "RemoveContainer" containerID="650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8" Mar 13 20:47:04 crc kubenswrapper[4790]: E0313 20:47:04.183616 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8\": container with ID starting with 650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8 not found: ID does not exist" containerID="650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8" Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.183668 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8"} err="failed to get container status \"650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8\": rpc error: code = NotFound desc = could not find container \"650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8\": container with ID starting with 650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8 not found: ID does not exist" Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.183700 4790 scope.go:117] "RemoveContainer" containerID="5162dfca40a2769b55fd2264df73ec279e1c28b641cba843385959879d6e1300" Mar 13 20:47:04 crc kubenswrapper[4790]: E0313 20:47:04.188891 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5162dfca40a2769b55fd2264df73ec279e1c28b641cba843385959879d6e1300\": container with ID starting with 5162dfca40a2769b55fd2264df73ec279e1c28b641cba843385959879d6e1300 not found: ID does not exist" containerID="5162dfca40a2769b55fd2264df73ec279e1c28b641cba843385959879d6e1300" Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.188941 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5162dfca40a2769b55fd2264df73ec279e1c28b641cba843385959879d6e1300"} err="failed to get container status \"5162dfca40a2769b55fd2264df73ec279e1c28b641cba843385959879d6e1300\": rpc error: code = NotFound desc = could not find container \"5162dfca40a2769b55fd2264df73ec279e1c28b641cba843385959879d6e1300\": container with ID starting with 5162dfca40a2769b55fd2264df73ec279e1c28b641cba843385959879d6e1300 not found: ID does not exist" Mar 13 20:47:05 crc kubenswrapper[4790]: I0313 20:47:05.109008 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" 
event={"ID":"34d41874-8dfa-4e3d-9298-d027a3e3c921","Type":"ContainerStarted","Data":"267c9dc16cc049015bd4edf304ecb796705f9133394d3f5b1d188823da72e942"} Mar 13 20:47:05 crc kubenswrapper[4790]: I0313 20:47:05.110842 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:05 crc kubenswrapper[4790]: I0313 20:47:05.137083 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" podStartSLOduration=4.137063481 podStartE2EDuration="4.137063481s" podCreationTimestamp="2026-03-13 20:47:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:05.130129352 +0000 UTC m=+1156.151245243" watchObservedRunningTime="2026-03-13 20:47:05.137063481 +0000 UTC m=+1156.158179372" Mar 13 20:47:05 crc kubenswrapper[4790]: I0313 20:47:05.669574 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" path="/var/lib/kubelet/pods/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2/volumes" Mar 13 20:47:06 crc kubenswrapper[4790]: I0313 20:47:06.410661 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:47:06 crc kubenswrapper[4790]: I0313 20:47:06.521556 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.133184 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-56s96"] Mar 13 20:47:08 crc kubenswrapper[4790]: E0313 20:47:08.134168 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" containerName="init" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.134229 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" containerName="init" Mar 13 20:47:08 crc kubenswrapper[4790]: E0313 20:47:08.134333 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" containerName="dnsmasq-dns" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.134402 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" containerName="dnsmasq-dns" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.134639 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" containerName="dnsmasq-dns" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.135672 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-56s96" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.143879 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-56s96"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.164118 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd5fj\" (UniqueName: \"kubernetes.io/projected/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-kube-api-access-sd5fj\") pod \"cinder-db-create-56s96\" (UID: \"8e11abfd-7d59-479b-9f77-cbbd22cbf48c\") " pod="openstack/cinder-db-create-56s96" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.164233 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-operator-scripts\") pod \"cinder-db-create-56s96\" (UID: \"8e11abfd-7d59-479b-9f77-cbbd22cbf48c\") " pod="openstack/cinder-db-create-56s96" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.241585 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-eae0-account-create-update-ljhjl"] Mar 13 20:47:08 crc 
kubenswrapper[4790]: I0313 20:47:08.242658 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-eae0-account-create-update-ljhjl" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.246079 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.253208 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-eae0-account-create-update-ljhjl"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.265629 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-operator-scripts\") pod \"cinder-db-create-56s96\" (UID: \"8e11abfd-7d59-479b-9f77-cbbd22cbf48c\") " pod="openstack/cinder-db-create-56s96" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.265707 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd5fj\" (UniqueName: \"kubernetes.io/projected/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-kube-api-access-sd5fj\") pod \"cinder-db-create-56s96\" (UID: \"8e11abfd-7d59-479b-9f77-cbbd22cbf48c\") " pod="openstack/cinder-db-create-56s96" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.266444 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-operator-scripts\") pod \"cinder-db-create-56s96\" (UID: \"8e11abfd-7d59-479b-9f77-cbbd22cbf48c\") " pod="openstack/cinder-db-create-56s96" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.287031 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd5fj\" (UniqueName: \"kubernetes.io/projected/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-kube-api-access-sd5fj\") pod \"cinder-db-create-56s96\" (UID: \"8e11abfd-7d59-479b-9f77-cbbd22cbf48c\") " 
pod="openstack/cinder-db-create-56s96" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.336255 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-f926w"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.337559 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-f926w" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.348773 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-f926w"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.367169 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-operator-scripts\") pod \"barbican-db-create-f926w\" (UID: \"1dd76b06-ea34-4044-bba0-cf5e6e822b6b\") " pod="openstack/barbican-db-create-f926w" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.367238 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5hkm\" (UniqueName: \"kubernetes.io/projected/e7d496eb-3f17-4e7b-9a68-c91dec27355a-kube-api-access-b5hkm\") pod \"cinder-eae0-account-create-update-ljhjl\" (UID: \"e7d496eb-3f17-4e7b-9a68-c91dec27355a\") " pod="openstack/cinder-eae0-account-create-update-ljhjl" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.367275 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vj2n\" (UniqueName: \"kubernetes.io/projected/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-kube-api-access-6vj2n\") pod \"barbican-db-create-f926w\" (UID: \"1dd76b06-ea34-4044-bba0-cf5e6e822b6b\") " pod="openstack/barbican-db-create-f926w" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.367306 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e7d496eb-3f17-4e7b-9a68-c91dec27355a-operator-scripts\") pod \"cinder-eae0-account-create-update-ljhjl\" (UID: \"e7d496eb-3f17-4e7b-9a68-c91dec27355a\") " pod="openstack/cinder-eae0-account-create-update-ljhjl" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.444041 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-3bc0-account-create-update-ntn27"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.445448 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3bc0-account-create-update-ntn27" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.448342 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.453277 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3bc0-account-create-update-ntn27"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.457671 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-56s96" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.472947 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5hkm\" (UniqueName: \"kubernetes.io/projected/e7d496eb-3f17-4e7b-9a68-c91dec27355a-kube-api-access-b5hkm\") pod \"cinder-eae0-account-create-update-ljhjl\" (UID: \"e7d496eb-3f17-4e7b-9a68-c91dec27355a\") " pod="openstack/cinder-eae0-account-create-update-ljhjl" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.472991 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vj2n\" (UniqueName: \"kubernetes.io/projected/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-kube-api-access-6vj2n\") pod \"barbican-db-create-f926w\" (UID: \"1dd76b06-ea34-4044-bba0-cf5e6e822b6b\") " pod="openstack/barbican-db-create-f926w" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.473018 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7d496eb-3f17-4e7b-9a68-c91dec27355a-operator-scripts\") pod \"cinder-eae0-account-create-update-ljhjl\" (UID: \"e7d496eb-3f17-4e7b-9a68-c91dec27355a\") " pod="openstack/cinder-eae0-account-create-update-ljhjl" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.473096 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-operator-scripts\") pod \"barbican-db-create-f926w\" (UID: \"1dd76b06-ea34-4044-bba0-cf5e6e822b6b\") " pod="openstack/barbican-db-create-f926w" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.473766 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-operator-scripts\") pod \"barbican-db-create-f926w\" (UID: 
\"1dd76b06-ea34-4044-bba0-cf5e6e822b6b\") " pod="openstack/barbican-db-create-f926w" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.474597 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7d496eb-3f17-4e7b-9a68-c91dec27355a-operator-scripts\") pod \"cinder-eae0-account-create-update-ljhjl\" (UID: \"e7d496eb-3f17-4e7b-9a68-c91dec27355a\") " pod="openstack/cinder-eae0-account-create-update-ljhjl" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.501468 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vj2n\" (UniqueName: \"kubernetes.io/projected/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-kube-api-access-6vj2n\") pod \"barbican-db-create-f926w\" (UID: \"1dd76b06-ea34-4044-bba0-cf5e6e822b6b\") " pod="openstack/barbican-db-create-f926w" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.501930 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5hkm\" (UniqueName: \"kubernetes.io/projected/e7d496eb-3f17-4e7b-9a68-c91dec27355a-kube-api-access-b5hkm\") pod \"cinder-eae0-account-create-update-ljhjl\" (UID: \"e7d496eb-3f17-4e7b-9a68-c91dec27355a\") " pod="openstack/cinder-eae0-account-create-update-ljhjl" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.548077 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-4p54c"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.549983 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4p54c" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.559839 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4p54c"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.570133 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-eae0-account-create-update-ljhjl" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.574233 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4q5h\" (UniqueName: \"kubernetes.io/projected/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-kube-api-access-r4q5h\") pod \"barbican-3bc0-account-create-update-ntn27\" (UID: \"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c\") " pod="openstack/barbican-3bc0-account-create-update-ntn27" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.574285 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-operator-scripts\") pod \"barbican-3bc0-account-create-update-ntn27\" (UID: \"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c\") " pod="openstack/barbican-3bc0-account-create-update-ntn27" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.648203 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-jf9fb"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.649219 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.653554 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-f926w" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.657676 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.658811 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jntkf" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.659028 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.659162 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.659937 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jf9fb"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.675311 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-4d80-account-create-update-7trkt"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.676578 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4d80-account-create-update-7trkt" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.683371 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e551be1a-728e-4851-894c-30b4493326d6-operator-scripts\") pod \"neutron-db-create-4p54c\" (UID: \"e551be1a-728e-4851-894c-30b4493326d6\") " pod="openstack/neutron-db-create-4p54c" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.683485 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4q5h\" (UniqueName: \"kubernetes.io/projected/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-kube-api-access-r4q5h\") pod \"barbican-3bc0-account-create-update-ntn27\" (UID: \"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c\") " pod="openstack/barbican-3bc0-account-create-update-ntn27" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.683530 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-operator-scripts\") pod \"barbican-3bc0-account-create-update-ntn27\" (UID: \"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c\") " pod="openstack/barbican-3bc0-account-create-update-ntn27" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.683566 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45trb\" (UniqueName: \"kubernetes.io/projected/e551be1a-728e-4851-894c-30b4493326d6-kube-api-access-45trb\") pod \"neutron-db-create-4p54c\" (UID: \"e551be1a-728e-4851-894c-30b4493326d6\") " pod="openstack/neutron-db-create-4p54c" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.685185 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.687836 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-operator-scripts\") pod \"barbican-3bc0-account-create-update-ntn27\" (UID: \"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c\") " pod="openstack/barbican-3bc0-account-create-update-ntn27" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.690504 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4d80-account-create-update-7trkt"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.712258 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4q5h\" (UniqueName: \"kubernetes.io/projected/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-kube-api-access-r4q5h\") pod \"barbican-3bc0-account-create-update-ntn27\" (UID: \"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c\") " pod="openstack/barbican-3bc0-account-create-update-ntn27" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.758868 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-3bc0-account-create-update-ntn27" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.784497 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-combined-ca-bundle\") pod \"keystone-db-sync-jf9fb\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.784562 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2wj7\" (UniqueName: \"kubernetes.io/projected/fc51014c-323e-4a6b-9202-edc7b135809d-kube-api-access-t2wj7\") pod \"neutron-4d80-account-create-update-7trkt\" (UID: \"fc51014c-323e-4a6b-9202-edc7b135809d\") " pod="openstack/neutron-4d80-account-create-update-7trkt" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.784591 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e551be1a-728e-4851-894c-30b4493326d6-operator-scripts\") pod \"neutron-db-create-4p54c\" (UID: \"e551be1a-728e-4851-894c-30b4493326d6\") " pod="openstack/neutron-db-create-4p54c" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.784629 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-config-data\") pod \"keystone-db-sync-jf9fb\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.784660 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbwsn\" (UniqueName: \"kubernetes.io/projected/4214f238-4044-45ab-8e40-48894500f25f-kube-api-access-pbwsn\") pod 
\"keystone-db-sync-jf9fb\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.784686 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc51014c-323e-4a6b-9202-edc7b135809d-operator-scripts\") pod \"neutron-4d80-account-create-update-7trkt\" (UID: \"fc51014c-323e-4a6b-9202-edc7b135809d\") " pod="openstack/neutron-4d80-account-create-update-7trkt" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.784732 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45trb\" (UniqueName: \"kubernetes.io/projected/e551be1a-728e-4851-894c-30b4493326d6-kube-api-access-45trb\") pod \"neutron-db-create-4p54c\" (UID: \"e551be1a-728e-4851-894c-30b4493326d6\") " pod="openstack/neutron-db-create-4p54c" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.785944 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e551be1a-728e-4851-894c-30b4493326d6-operator-scripts\") pod \"neutron-db-create-4p54c\" (UID: \"e551be1a-728e-4851-894c-30b4493326d6\") " pod="openstack/neutron-db-create-4p54c" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.811980 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45trb\" (UniqueName: \"kubernetes.io/projected/e551be1a-728e-4851-894c-30b4493326d6-kube-api-access-45trb\") pod \"neutron-db-create-4p54c\" (UID: \"e551be1a-728e-4851-894c-30b4493326d6\") " pod="openstack/neutron-db-create-4p54c" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.880090 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-4p54c" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.885860 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc51014c-323e-4a6b-9202-edc7b135809d-operator-scripts\") pod \"neutron-4d80-account-create-update-7trkt\" (UID: \"fc51014c-323e-4a6b-9202-edc7b135809d\") " pod="openstack/neutron-4d80-account-create-update-7trkt" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.885966 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-combined-ca-bundle\") pod \"keystone-db-sync-jf9fb\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.885991 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2wj7\" (UniqueName: \"kubernetes.io/projected/fc51014c-323e-4a6b-9202-edc7b135809d-kube-api-access-t2wj7\") pod \"neutron-4d80-account-create-update-7trkt\" (UID: \"fc51014c-323e-4a6b-9202-edc7b135809d\") " pod="openstack/neutron-4d80-account-create-update-7trkt" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.886032 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-config-data\") pod \"keystone-db-sync-jf9fb\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.886063 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbwsn\" (UniqueName: \"kubernetes.io/projected/4214f238-4044-45ab-8e40-48894500f25f-kube-api-access-pbwsn\") pod \"keystone-db-sync-jf9fb\" (UID: 
\"4214f238-4044-45ab-8e40-48894500f25f\") " pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.887126 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc51014c-323e-4a6b-9202-edc7b135809d-operator-scripts\") pod \"neutron-4d80-account-create-update-7trkt\" (UID: \"fc51014c-323e-4a6b-9202-edc7b135809d\") " pod="openstack/neutron-4d80-account-create-update-7trkt" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.890038 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-combined-ca-bundle\") pod \"keystone-db-sync-jf9fb\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.891263 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-config-data\") pod \"keystone-db-sync-jf9fb\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.903851 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2wj7\" (UniqueName: \"kubernetes.io/projected/fc51014c-323e-4a6b-9202-edc7b135809d-kube-api-access-t2wj7\") pod \"neutron-4d80-account-create-update-7trkt\" (UID: \"fc51014c-323e-4a6b-9202-edc7b135809d\") " pod="openstack/neutron-4d80-account-create-update-7trkt" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.904819 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbwsn\" (UniqueName: \"kubernetes.io/projected/4214f238-4044-45ab-8e40-48894500f25f-kube-api-access-pbwsn\") pod \"keystone-db-sync-jf9fb\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " 
pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.970053 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:09 crc kubenswrapper[4790]: I0313 20:47:09.002454 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-56s96"] Mar 13 20:47:09 crc kubenswrapper[4790]: I0313 20:47:09.008750 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4d80-account-create-update-7trkt" Mar 13 20:47:09 crc kubenswrapper[4790]: I0313 20:47:09.152058 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-56s96" event={"ID":"8e11abfd-7d59-479b-9f77-cbbd22cbf48c","Type":"ContainerStarted","Data":"5b4afab25af66e7d81a8a8f191da40314dafdae52dadeffca7f74035d3ce1c8a"} Mar 13 20:47:09 crc kubenswrapper[4790]: I0313 20:47:09.210753 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-f926w"] Mar 13 20:47:09 crc kubenswrapper[4790]: W0313 20:47:09.225929 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dd76b06_ea34_4044_bba0_cf5e6e822b6b.slice/crio-69eb03448cc715ed946241b6a35f2a90772b59bd462ea39afd9ab3efb2303480 WatchSource:0}: Error finding container 69eb03448cc715ed946241b6a35f2a90772b59bd462ea39afd9ab3efb2303480: Status 404 returned error can't find the container with id 69eb03448cc715ed946241b6a35f2a90772b59bd462ea39afd9ab3efb2303480 Mar 13 20:47:09 crc kubenswrapper[4790]: I0313 20:47:09.303413 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-eae0-account-create-update-ljhjl"] Mar 13 20:47:09 crc kubenswrapper[4790]: I0313 20:47:09.449956 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3bc0-account-create-update-ntn27"] Mar 13 20:47:09 crc kubenswrapper[4790]: W0313 20:47:09.460839 
4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfc09f48_1b0c_45fe_be9b_8bf3a3af887c.slice/crio-2f031028c701aab147e28455b17e92b1df62872441dc58d22a17cfab01dd04b9 WatchSource:0}: Error finding container 2f031028c701aab147e28455b17e92b1df62872441dc58d22a17cfab01dd04b9: Status 404 returned error can't find the container with id 2f031028c701aab147e28455b17e92b1df62872441dc58d22a17cfab01dd04b9 Mar 13 20:47:09 crc kubenswrapper[4790]: W0313 20:47:09.464613 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode551be1a_728e_4851_894c_30b4493326d6.slice/crio-fcb11536e1613f8250dbcb4276acec420f284088247038bb304099f68d9aabb2 WatchSource:0}: Error finding container fcb11536e1613f8250dbcb4276acec420f284088247038bb304099f68d9aabb2: Status 404 returned error can't find the container with id fcb11536e1613f8250dbcb4276acec420f284088247038bb304099f68d9aabb2 Mar 13 20:47:09 crc kubenswrapper[4790]: I0313 20:47:09.506591 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4p54c"] Mar 13 20:47:09 crc kubenswrapper[4790]: I0313 20:47:09.607842 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4d80-account-create-update-7trkt"] Mar 13 20:47:09 crc kubenswrapper[4790]: I0313 20:47:09.615272 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jf9fb"] Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.162557 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jf9fb" event={"ID":"4214f238-4044-45ab-8e40-48894500f25f","Type":"ContainerStarted","Data":"02abbfffc75d71b3a16f41eb48aa257f23b07611d502300367b9d004460a9261"} Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.163798 4790 generic.go:334] "Generic (PLEG): container finished" podID="e551be1a-728e-4851-894c-30b4493326d6" 
containerID="047c96b0959e792e896cbcb062d30482e777ac7ce2334a4427efe91c5a39d9a3" exitCode=0 Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.163851 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4p54c" event={"ID":"e551be1a-728e-4851-894c-30b4493326d6","Type":"ContainerDied","Data":"047c96b0959e792e896cbcb062d30482e777ac7ce2334a4427efe91c5a39d9a3"} Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.163870 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4p54c" event={"ID":"e551be1a-728e-4851-894c-30b4493326d6","Type":"ContainerStarted","Data":"fcb11536e1613f8250dbcb4276acec420f284088247038bb304099f68d9aabb2"} Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.168680 4790 generic.go:334] "Generic (PLEG): container finished" podID="fc51014c-323e-4a6b-9202-edc7b135809d" containerID="7f1ca4be311e4bf8899acd7ffc7b40f8dd562b652669b076fe646ca2df5ae15e" exitCode=0 Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.168772 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4d80-account-create-update-7trkt" event={"ID":"fc51014c-323e-4a6b-9202-edc7b135809d","Type":"ContainerDied","Data":"7f1ca4be311e4bf8899acd7ffc7b40f8dd562b652669b076fe646ca2df5ae15e"} Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.168797 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4d80-account-create-update-7trkt" event={"ID":"fc51014c-323e-4a6b-9202-edc7b135809d","Type":"ContainerStarted","Data":"87a437edea37085830d79b3968bbcf7a8e5b2600d9fbf0f8b597555054038d5f"} Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.181730 4790 generic.go:334] "Generic (PLEG): container finished" podID="8e11abfd-7d59-479b-9f77-cbbd22cbf48c" containerID="36f3978e6e158babd7d1c6c18b804e801c1d5a860c6298e1e465b9030818d00c" exitCode=0 Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.181820 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-create-56s96" event={"ID":"8e11abfd-7d59-479b-9f77-cbbd22cbf48c","Type":"ContainerDied","Data":"36f3978e6e158babd7d1c6c18b804e801c1d5a860c6298e1e465b9030818d00c"} Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.189136 4790 generic.go:334] "Generic (PLEG): container finished" podID="1dd76b06-ea34-4044-bba0-cf5e6e822b6b" containerID="37311d8f14a45460392cc2657752fc09be6fc325071ebe0626eb04d799e80545" exitCode=0 Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.189209 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f926w" event={"ID":"1dd76b06-ea34-4044-bba0-cf5e6e822b6b","Type":"ContainerDied","Data":"37311d8f14a45460392cc2657752fc09be6fc325071ebe0626eb04d799e80545"} Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.189239 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f926w" event={"ID":"1dd76b06-ea34-4044-bba0-cf5e6e822b6b","Type":"ContainerStarted","Data":"69eb03448cc715ed946241b6a35f2a90772b59bd462ea39afd9ab3efb2303480"} Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.219411 4790 generic.go:334] "Generic (PLEG): container finished" podID="cfc09f48-1b0c-45fe-be9b-8bf3a3af887c" containerID="fb829732267d5d36436612626f2036bb0698b4bd86f5c88383f3ee7aba396142" exitCode=0 Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.219520 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3bc0-account-create-update-ntn27" event={"ID":"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c","Type":"ContainerDied","Data":"fb829732267d5d36436612626f2036bb0698b4bd86f5c88383f3ee7aba396142"} Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.219563 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3bc0-account-create-update-ntn27" event={"ID":"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c","Type":"ContainerStarted","Data":"2f031028c701aab147e28455b17e92b1df62872441dc58d22a17cfab01dd04b9"} Mar 13 20:47:10 crc 
kubenswrapper[4790]: I0313 20:47:10.260849 4790 generic.go:334] "Generic (PLEG): container finished" podID="e7d496eb-3f17-4e7b-9a68-c91dec27355a" containerID="4f445f85254948b2a82910d93997f50d41021103d40e52ebd6447aec6a71de39" exitCode=0 Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.260913 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-eae0-account-create-update-ljhjl" event={"ID":"e7d496eb-3f17-4e7b-9a68-c91dec27355a","Type":"ContainerDied","Data":"4f445f85254948b2a82910d93997f50d41021103d40e52ebd6447aec6a71de39"} Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.260937 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-eae0-account-create-update-ljhjl" event={"ID":"e7d496eb-3f17-4e7b-9a68-c91dec27355a","Type":"ContainerStarted","Data":"8ef7747d9ef2dbad4cef572c11db32bad3072a3edb1b59e19bb36fd7f24b5297"} Mar 13 20:47:11 crc kubenswrapper[4790]: I0313 20:47:11.776211 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:11 crc kubenswrapper[4790]: I0313 20:47:11.901313 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gv56q"] Mar 13 20:47:11 crc kubenswrapper[4790]: I0313 20:47:11.901605 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-gv56q" podUID="d798b6d8-8c2b-4827-81d3-09177054591f" containerName="dnsmasq-dns" containerID="cri-o://e8757b5e6f39b607b4f89f7c3ecb73428b1e5ac3dca1607fb2f473649fb57fcb" gracePeriod=10 Mar 13 20:47:12 crc kubenswrapper[4790]: I0313 20:47:12.282137 4790 generic.go:334] "Generic (PLEG): container finished" podID="d798b6d8-8c2b-4827-81d3-09177054591f" containerID="e8757b5e6f39b607b4f89f7c3ecb73428b1e5ac3dca1607fb2f473649fb57fcb" exitCode=0 Mar 13 20:47:12 crc kubenswrapper[4790]: I0313 20:47:12.282216 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-698758b865-gv56q" event={"ID":"d798b6d8-8c2b-4827-81d3-09177054591f","Type":"ContainerDied","Data":"e8757b5e6f39b607b4f89f7c3ecb73428b1e5ac3dca1607fb2f473649fb57fcb"} Mar 13 20:47:14 crc kubenswrapper[4790]: I0313 20:47:14.015335 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:47:14 crc kubenswrapper[4790]: I0313 20:47:14.015425 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:47:14 crc kubenswrapper[4790]: I0313 20:47:14.957024 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3bc0-account-create-update-ntn27" Mar 13 20:47:14 crc kubenswrapper[4790]: I0313 20:47:14.965204 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-4p54c" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.012693 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-operator-scripts\") pod \"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c\" (UID: \"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.012834 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e551be1a-728e-4851-894c-30b4493326d6-operator-scripts\") pod \"e551be1a-728e-4851-894c-30b4493326d6\" (UID: \"e551be1a-728e-4851-894c-30b4493326d6\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.012946 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45trb\" (UniqueName: \"kubernetes.io/projected/e551be1a-728e-4851-894c-30b4493326d6-kube-api-access-45trb\") pod \"e551be1a-728e-4851-894c-30b4493326d6\" (UID: \"e551be1a-728e-4851-894c-30b4493326d6\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.012998 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4q5h\" (UniqueName: \"kubernetes.io/projected/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-kube-api-access-r4q5h\") pod \"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c\" (UID: \"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.013108 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cfc09f48-1b0c-45fe-be9b-8bf3a3af887c" (UID: "cfc09f48-1b0c-45fe-be9b-8bf3a3af887c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.013240 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e551be1a-728e-4851-894c-30b4493326d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e551be1a-728e-4851-894c-30b4493326d6" (UID: "e551be1a-728e-4851-894c-30b4493326d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.013995 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e551be1a-728e-4851-894c-30b4493326d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.014012 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.017707 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-kube-api-access-r4q5h" (OuterVolumeSpecName: "kube-api-access-r4q5h") pod "cfc09f48-1b0c-45fe-be9b-8bf3a3af887c" (UID: "cfc09f48-1b0c-45fe-be9b-8bf3a3af887c"). InnerVolumeSpecName "kube-api-access-r4q5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.017818 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e551be1a-728e-4851-894c-30b4493326d6-kube-api-access-45trb" (OuterVolumeSpecName: "kube-api-access-45trb") pod "e551be1a-728e-4851-894c-30b4493326d6" (UID: "e551be1a-728e-4851-894c-30b4493326d6"). InnerVolumeSpecName "kube-api-access-45trb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.026254 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-56s96" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.036597 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-f926w" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.048888 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4d80-account-create-update-7trkt" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.064759 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-eae0-account-create-update-ljhjl" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.076313 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.114539 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vj2n\" (UniqueName: \"kubernetes.io/projected/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-kube-api-access-6vj2n\") pod \"1dd76b06-ea34-4044-bba0-cf5e6e822b6b\" (UID: \"1dd76b06-ea34-4044-bba0-cf5e6e822b6b\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.114599 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2wj7\" (UniqueName: \"kubernetes.io/projected/fc51014c-323e-4a6b-9202-edc7b135809d-kube-api-access-t2wj7\") pod \"fc51014c-323e-4a6b-9202-edc7b135809d\" (UID: \"fc51014c-323e-4a6b-9202-edc7b135809d\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.114621 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd5fj\" (UniqueName: 
\"kubernetes.io/projected/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-kube-api-access-sd5fj\") pod \"8e11abfd-7d59-479b-9f77-cbbd22cbf48c\" (UID: \"8e11abfd-7d59-479b-9f77-cbbd22cbf48c\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.114668 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-operator-scripts\") pod \"8e11abfd-7d59-479b-9f77-cbbd22cbf48c\" (UID: \"8e11abfd-7d59-479b-9f77-cbbd22cbf48c\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.114698 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-operator-scripts\") pod \"1dd76b06-ea34-4044-bba0-cf5e6e822b6b\" (UID: \"1dd76b06-ea34-4044-bba0-cf5e6e822b6b\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.114788 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc51014c-323e-4a6b-9202-edc7b135809d-operator-scripts\") pod \"fc51014c-323e-4a6b-9202-edc7b135809d\" (UID: \"fc51014c-323e-4a6b-9202-edc7b135809d\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.114837 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7d496eb-3f17-4e7b-9a68-c91dec27355a-operator-scripts\") pod \"e7d496eb-3f17-4e7b-9a68-c91dec27355a\" (UID: \"e7d496eb-3f17-4e7b-9a68-c91dec27355a\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.114874 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5hkm\" (UniqueName: \"kubernetes.io/projected/e7d496eb-3f17-4e7b-9a68-c91dec27355a-kube-api-access-b5hkm\") pod \"e7d496eb-3f17-4e7b-9a68-c91dec27355a\" (UID: \"e7d496eb-3f17-4e7b-9a68-c91dec27355a\") " Mar 13 20:47:15 
crc kubenswrapper[4790]: I0313 20:47:15.115221 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45trb\" (UniqueName: \"kubernetes.io/projected/e551be1a-728e-4851-894c-30b4493326d6-kube-api-access-45trb\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.115246 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4q5h\" (UniqueName: \"kubernetes.io/projected/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-kube-api-access-r4q5h\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.119086 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7d496eb-3f17-4e7b-9a68-c91dec27355a-kube-api-access-b5hkm" (OuterVolumeSpecName: "kube-api-access-b5hkm") pod "e7d496eb-3f17-4e7b-9a68-c91dec27355a" (UID: "e7d496eb-3f17-4e7b-9a68-c91dec27355a"). InnerVolumeSpecName "kube-api-access-b5hkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.127195 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-kube-api-access-6vj2n" (OuterVolumeSpecName: "kube-api-access-6vj2n") pod "1dd76b06-ea34-4044-bba0-cf5e6e822b6b" (UID: "1dd76b06-ea34-4044-bba0-cf5e6e822b6b"). InnerVolumeSpecName "kube-api-access-6vj2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.127861 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8e11abfd-7d59-479b-9f77-cbbd22cbf48c" (UID: "8e11abfd-7d59-479b-9f77-cbbd22cbf48c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.128355 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1dd76b06-ea34-4044-bba0-cf5e6e822b6b" (UID: "1dd76b06-ea34-4044-bba0-cf5e6e822b6b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.128787 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc51014c-323e-4a6b-9202-edc7b135809d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fc51014c-323e-4a6b-9202-edc7b135809d" (UID: "fc51014c-323e-4a6b-9202-edc7b135809d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.129578 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7d496eb-3f17-4e7b-9a68-c91dec27355a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e7d496eb-3f17-4e7b-9a68-c91dec27355a" (UID: "e7d496eb-3f17-4e7b-9a68-c91dec27355a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.131395 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-kube-api-access-sd5fj" (OuterVolumeSpecName: "kube-api-access-sd5fj") pod "8e11abfd-7d59-479b-9f77-cbbd22cbf48c" (UID: "8e11abfd-7d59-479b-9f77-cbbd22cbf48c"). InnerVolumeSpecName "kube-api-access-sd5fj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.134949 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc51014c-323e-4a6b-9202-edc7b135809d-kube-api-access-t2wj7" (OuterVolumeSpecName: "kube-api-access-t2wj7") pod "fc51014c-323e-4a6b-9202-edc7b135809d" (UID: "fc51014c-323e-4a6b-9202-edc7b135809d"). InnerVolumeSpecName "kube-api-access-t2wj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.215923 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-dns-svc\") pod \"d798b6d8-8c2b-4827-81d3-09177054591f\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216019 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-nb\") pod \"d798b6d8-8c2b-4827-81d3-09177054591f\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216047 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-sb\") pod \"d798b6d8-8c2b-4827-81d3-09177054591f\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216088 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmp6q\" (UniqueName: \"kubernetes.io/projected/d798b6d8-8c2b-4827-81d3-09177054591f-kube-api-access-bmp6q\") pod \"d798b6d8-8c2b-4827-81d3-09177054591f\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216147 
4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-config\") pod \"d798b6d8-8c2b-4827-81d3-09177054591f\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216564 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vj2n\" (UniqueName: \"kubernetes.io/projected/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-kube-api-access-6vj2n\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216586 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2wj7\" (UniqueName: \"kubernetes.io/projected/fc51014c-323e-4a6b-9202-edc7b135809d-kube-api-access-t2wj7\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216595 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd5fj\" (UniqueName: \"kubernetes.io/projected/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-kube-api-access-sd5fj\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216779 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216790 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216801 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc51014c-323e-4a6b-9202-edc7b135809d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216820 
4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7d496eb-3f17-4e7b-9a68-c91dec27355a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216833 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5hkm\" (UniqueName: \"kubernetes.io/projected/e7d496eb-3f17-4e7b-9a68-c91dec27355a-kube-api-access-b5hkm\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.219876 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d798b6d8-8c2b-4827-81d3-09177054591f-kube-api-access-bmp6q" (OuterVolumeSpecName: "kube-api-access-bmp6q") pod "d798b6d8-8c2b-4827-81d3-09177054591f" (UID: "d798b6d8-8c2b-4827-81d3-09177054591f"). InnerVolumeSpecName "kube-api-access-bmp6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.256489 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d798b6d8-8c2b-4827-81d3-09177054591f" (UID: "d798b6d8-8c2b-4827-81d3-09177054591f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.258476 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d798b6d8-8c2b-4827-81d3-09177054591f" (UID: "d798b6d8-8c2b-4827-81d3-09177054591f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.260630 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-config" (OuterVolumeSpecName: "config") pod "d798b6d8-8c2b-4827-81d3-09177054591f" (UID: "d798b6d8-8c2b-4827-81d3-09177054591f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.261589 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d798b6d8-8c2b-4827-81d3-09177054591f" (UID: "d798b6d8-8c2b-4827-81d3-09177054591f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.305131 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-56s96" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.305293 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-56s96" event={"ID":"8e11abfd-7d59-479b-9f77-cbbd22cbf48c","Type":"ContainerDied","Data":"5b4afab25af66e7d81a8a8f191da40314dafdae52dadeffca7f74035d3ce1c8a"} Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.305347 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b4afab25af66e7d81a8a8f191da40314dafdae52dadeffca7f74035d3ce1c8a" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.306859 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gv56q" event={"ID":"d798b6d8-8c2b-4827-81d3-09177054591f","Type":"ContainerDied","Data":"c3c4a9ad42afcf8df041c9c0555750547f59bfe84f27594765b42944074e50c3"} Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.306896 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.306933 4790 scope.go:117] "RemoveContainer" containerID="e8757b5e6f39b607b4f89f7c3ecb73428b1e5ac3dca1607fb2f473649fb57fcb" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.309809 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f926w" event={"ID":"1dd76b06-ea34-4044-bba0-cf5e6e822b6b","Type":"ContainerDied","Data":"69eb03448cc715ed946241b6a35f2a90772b59bd462ea39afd9ab3efb2303480"} Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.309851 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-f926w" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.309853 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69eb03448cc715ed946241b6a35f2a90772b59bd462ea39afd9ab3efb2303480" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.311929 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3bc0-account-create-update-ntn27" event={"ID":"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c","Type":"ContainerDied","Data":"2f031028c701aab147e28455b17e92b1df62872441dc58d22a17cfab01dd04b9"} Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.311963 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3bc0-account-create-update-ntn27" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.311972 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f031028c701aab147e28455b17e92b1df62872441dc58d22a17cfab01dd04b9" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.317491 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-eae0-account-create-update-ljhjl" event={"ID":"e7d496eb-3f17-4e7b-9a68-c91dec27355a","Type":"ContainerDied","Data":"8ef7747d9ef2dbad4cef572c11db32bad3072a3edb1b59e19bb36fd7f24b5297"} Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.317562 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-eae0-account-create-update-ljhjl" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.317570 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ef7747d9ef2dbad4cef572c11db32bad3072a3edb1b59e19bb36fd7f24b5297" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.318466 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.318480 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.318489 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.318497 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmp6q\" (UniqueName: \"kubernetes.io/projected/d798b6d8-8c2b-4827-81d3-09177054591f-kube-api-access-bmp6q\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.318506 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.325690 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jf9fb" event={"ID":"4214f238-4044-45ab-8e40-48894500f25f","Type":"ContainerStarted","Data":"5b9b7cadced0d29da460e85098fd79f31bf772b7450962d6c1f3bf09b62a0134"} Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.330470 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4p54c" event={"ID":"e551be1a-728e-4851-894c-30b4493326d6","Type":"ContainerDied","Data":"fcb11536e1613f8250dbcb4276acec420f284088247038bb304099f68d9aabb2"} Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.330526 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcb11536e1613f8250dbcb4276acec420f284088247038bb304099f68d9aabb2" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.330485 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4p54c" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.337031 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4d80-account-create-update-7trkt" event={"ID":"fc51014c-323e-4a6b-9202-edc7b135809d","Type":"ContainerDied","Data":"87a437edea37085830d79b3968bbcf7a8e5b2600d9fbf0f8b597555054038d5f"} Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.337066 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87a437edea37085830d79b3968bbcf7a8e5b2600d9fbf0f8b597555054038d5f" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.337149 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4d80-account-create-update-7trkt" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.343108 4790 scope.go:117] "RemoveContainer" containerID="978e68813566a9c04dd155a064a373e2649857a2eefbe05ca9b8949d3e9db280" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.349159 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gv56q"] Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.358765 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gv56q"] Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.365823 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-jf9fb" podStartSLOduration=2.219555691 podStartE2EDuration="7.365808466s" podCreationTimestamp="2026-03-13 20:47:08 +0000 UTC" firstStartedPulling="2026-03-13 20:47:09.620153525 +0000 UTC m=+1160.641269416" lastFinishedPulling="2026-03-13 20:47:14.76640629 +0000 UTC m=+1165.787522191" observedRunningTime="2026-03-13 20:47:15.35712908 +0000 UTC m=+1166.378244971" watchObservedRunningTime="2026-03-13 20:47:15.365808466 +0000 UTC m=+1166.386924357" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.675904 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d798b6d8-8c2b-4827-81d3-09177054591f" path="/var/lib/kubelet/pods/d798b6d8-8c2b-4827-81d3-09177054591f/volumes" Mar 13 20:47:18 crc kubenswrapper[4790]: I0313 20:47:18.362708 4790 generic.go:334] "Generic (PLEG): container finished" podID="4214f238-4044-45ab-8e40-48894500f25f" containerID="5b9b7cadced0d29da460e85098fd79f31bf772b7450962d6c1f3bf09b62a0134" exitCode=0 Mar 13 20:47:18 crc kubenswrapper[4790]: I0313 20:47:18.362827 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jf9fb" 
event={"ID":"4214f238-4044-45ab-8e40-48894500f25f","Type":"ContainerDied","Data":"5b9b7cadced0d29da460e85098fd79f31bf772b7450962d6c1f3bf09b62a0134"} Mar 13 20:47:19 crc kubenswrapper[4790]: I0313 20:47:19.748597 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:19 crc kubenswrapper[4790]: I0313 20:47:19.893319 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-config-data\") pod \"4214f238-4044-45ab-8e40-48894500f25f\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " Mar 13 20:47:19 crc kubenswrapper[4790]: I0313 20:47:19.893364 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbwsn\" (UniqueName: \"kubernetes.io/projected/4214f238-4044-45ab-8e40-48894500f25f-kube-api-access-pbwsn\") pod \"4214f238-4044-45ab-8e40-48894500f25f\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " Mar 13 20:47:19 crc kubenswrapper[4790]: I0313 20:47:19.893412 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-combined-ca-bundle\") pod \"4214f238-4044-45ab-8e40-48894500f25f\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " Mar 13 20:47:19 crc kubenswrapper[4790]: I0313 20:47:19.899142 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4214f238-4044-45ab-8e40-48894500f25f-kube-api-access-pbwsn" (OuterVolumeSpecName: "kube-api-access-pbwsn") pod "4214f238-4044-45ab-8e40-48894500f25f" (UID: "4214f238-4044-45ab-8e40-48894500f25f"). InnerVolumeSpecName "kube-api-access-pbwsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:19 crc kubenswrapper[4790]: I0313 20:47:19.919022 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4214f238-4044-45ab-8e40-48894500f25f" (UID: "4214f238-4044-45ab-8e40-48894500f25f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:19 crc kubenswrapper[4790]: I0313 20:47:19.937257 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-config-data" (OuterVolumeSpecName: "config-data") pod "4214f238-4044-45ab-8e40-48894500f25f" (UID: "4214f238-4044-45ab-8e40-48894500f25f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:19 crc kubenswrapper[4790]: I0313 20:47:19.995832 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:19 crc kubenswrapper[4790]: I0313 20:47:19.995873 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbwsn\" (UniqueName: \"kubernetes.io/projected/4214f238-4044-45ab-8e40-48894500f25f-kube-api-access-pbwsn\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:19 crc kubenswrapper[4790]: I0313 20:47:19.995889 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.378388 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jf9fb" 
event={"ID":"4214f238-4044-45ab-8e40-48894500f25f","Type":"ContainerDied","Data":"02abbfffc75d71b3a16f41eb48aa257f23b07611d502300367b9d004460a9261"} Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.378426 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02abbfffc75d71b3a16f41eb48aa257f23b07611d502300367b9d004460a9261" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.378474 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615068 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-87xrs"] Mar 13 20:47:20 crc kubenswrapper[4790]: E0313 20:47:20.615473 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d798b6d8-8c2b-4827-81d3-09177054591f" containerName="init" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615494 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d798b6d8-8c2b-4827-81d3-09177054591f" containerName="init" Mar 13 20:47:20 crc kubenswrapper[4790]: E0313 20:47:20.615507 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e11abfd-7d59-479b-9f77-cbbd22cbf48c" containerName="mariadb-database-create" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615515 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e11abfd-7d59-479b-9f77-cbbd22cbf48c" containerName="mariadb-database-create" Mar 13 20:47:20 crc kubenswrapper[4790]: E0313 20:47:20.615530 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc51014c-323e-4a6b-9202-edc7b135809d" containerName="mariadb-account-create-update" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615538 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc51014c-323e-4a6b-9202-edc7b135809d" containerName="mariadb-account-create-update" Mar 13 20:47:20 crc kubenswrapper[4790]: E0313 20:47:20.615559 4790 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4214f238-4044-45ab-8e40-48894500f25f" containerName="keystone-db-sync" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615568 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4214f238-4044-45ab-8e40-48894500f25f" containerName="keystone-db-sync" Mar 13 20:47:20 crc kubenswrapper[4790]: E0313 20:47:20.615582 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd76b06-ea34-4044-bba0-cf5e6e822b6b" containerName="mariadb-database-create" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615589 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd76b06-ea34-4044-bba0-cf5e6e822b6b" containerName="mariadb-database-create" Mar 13 20:47:20 crc kubenswrapper[4790]: E0313 20:47:20.615604 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d496eb-3f17-4e7b-9a68-c91dec27355a" containerName="mariadb-account-create-update" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615612 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d496eb-3f17-4e7b-9a68-c91dec27355a" containerName="mariadb-account-create-update" Mar 13 20:47:20 crc kubenswrapper[4790]: E0313 20:47:20.615625 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e551be1a-728e-4851-894c-30b4493326d6" containerName="mariadb-database-create" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615632 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e551be1a-728e-4851-894c-30b4493326d6" containerName="mariadb-database-create" Mar 13 20:47:20 crc kubenswrapper[4790]: E0313 20:47:20.615649 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d798b6d8-8c2b-4827-81d3-09177054591f" containerName="dnsmasq-dns" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615656 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d798b6d8-8c2b-4827-81d3-09177054591f" containerName="dnsmasq-dns" Mar 13 20:47:20 crc 
kubenswrapper[4790]: E0313 20:47:20.615671 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc09f48-1b0c-45fe-be9b-8bf3a3af887c" containerName="mariadb-account-create-update" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615680 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc09f48-1b0c-45fe-be9b-8bf3a3af887c" containerName="mariadb-account-create-update" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615860 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e11abfd-7d59-479b-9f77-cbbd22cbf48c" containerName="mariadb-database-create" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615879 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfc09f48-1b0c-45fe-be9b-8bf3a3af887c" containerName="mariadb-account-create-update" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615889 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d496eb-3f17-4e7b-9a68-c91dec27355a" containerName="mariadb-account-create-update" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615902 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4214f238-4044-45ab-8e40-48894500f25f" containerName="keystone-db-sync" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615930 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dd76b06-ea34-4044-bba0-cf5e6e822b6b" containerName="mariadb-database-create" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615944 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc51014c-323e-4a6b-9202-edc7b135809d" containerName="mariadb-account-create-update" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615957 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d798b6d8-8c2b-4827-81d3-09177054591f" containerName="dnsmasq-dns" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615967 4790 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="e551be1a-728e-4851-894c-30b4493326d6" containerName="mariadb-database-create" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.616971 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.629597 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-87xrs"] Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.686139 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rfr4j"] Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.687083 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.690767 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.691192 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.691406 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.691563 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.691703 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jntkf" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.707431 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-config\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 
crc kubenswrapper[4790]: I0313 20:47:20.707463 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.707500 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qth2k\" (UniqueName: \"kubernetes.io/projected/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-kube-api-access-qth2k\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.707584 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.707603 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.707627 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-svc\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " 
pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.719331 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rfr4j"] Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.809107 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-config\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.809152 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.809203 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qth2k\" (UniqueName: \"kubernetes.io/projected/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-kube-api-access-qth2k\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.809235 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-fernet-keys\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.809677 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-credential-keys\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.810152 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-config\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.810815 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.810882 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-config-data\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.811041 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-combined-ca-bundle\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.811182 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.812020 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.812039 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.811217 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.812120 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcf2d\" (UniqueName: \"kubernetes.io/projected/5c397c6e-8d19-4b92-bc31-61312531b3d9-kube-api-access-kcf2d\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.812205 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-scripts\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.812262 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-svc\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.812861 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-svc\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.854937 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qth2k\" (UniqueName: \"kubernetes.io/projected/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-kube-api-access-qth2k\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.885547 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-mg4xg"] Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.886573 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.891584 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.892020 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-6dm5h" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.892218 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.907786 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-58656c768f-spczn"] Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.909455 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.914929 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-combined-ca-bundle\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.915002 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcf2d\" (UniqueName: \"kubernetes.io/projected/5c397c6e-8d19-4b92-bc31-61312531b3d9-kube-api-access-kcf2d\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.915039 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-scripts\") pod \"keystone-bootstrap-rfr4j\" (UID: 
\"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.915115 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-fernet-keys\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.915167 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-credential-keys\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.915228 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-config-data\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.917928 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-697p5" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.918177 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.918312 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.918463 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.943897 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-scripts\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.944000 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.945273 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-credential-keys\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.946619 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-mg4xg"] Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.953366 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-combined-ca-bundle\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.953834 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-fernet-keys\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.959905 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-config-data\") pod \"keystone-bootstrap-rfr4j\" (UID: 
\"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.978145 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-g2nmn"] Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.979247 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.984754 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9qb6s" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.985047 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.985210 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:20.999007 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcf2d\" (UniqueName: \"kubernetes.io/projected/5c397c6e-8d19-4b92-bc31-61312531b3d9-kube-api-access-kcf2d\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.018068 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.018986 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-combined-ca-bundle\") pod \"neutron-db-sync-mg4xg\" (UID: \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.019024 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-config-data\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.019044 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llgn6\" (UniqueName: \"kubernetes.io/projected/eef97bfb-4275-4a0a-bae4-5442cf7400dd-kube-api-access-llgn6\") pod \"neutron-db-sync-mg4xg\" (UID: \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.019063 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-horizon-secret-key\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.019079 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-config\") pod \"neutron-db-sync-mg4xg\" (UID: 
\"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.019121 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-logs\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.019142 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2wvs\" (UniqueName: \"kubernetes.io/projected/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-kube-api-access-v2wvs\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.019169 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-scripts\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.048453 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58656c768f-spczn"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.069442 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-g2nmn"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.120889 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-combined-ca-bundle\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc 
kubenswrapper[4790]: I0313 20:47:21.120948 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-combined-ca-bundle\") pod \"neutron-db-sync-mg4xg\" (UID: \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.120983 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-scripts\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.121013 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-config-data\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.121038 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llgn6\" (UniqueName: \"kubernetes.io/projected/eef97bfb-4275-4a0a-bae4-5442cf7400dd-kube-api-access-llgn6\") pod \"neutron-db-sync-mg4xg\" (UID: \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.121060 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-horizon-secret-key\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.121080 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-config\") pod \"neutron-db-sync-mg4xg\" (UID: \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.121122 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mdhq\" (UniqueName: \"kubernetes.io/projected/32ffb609-7a3b-42b7-b513-7003deefe5dd-kube-api-access-8mdhq\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.121170 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-logs\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.121202 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2wvs\" (UniqueName: \"kubernetes.io/projected/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-kube-api-access-v2wvs\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.121241 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32ffb609-7a3b-42b7-b513-7003deefe5dd-etc-machine-id\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.121265 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-scripts\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.121304 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-db-sync-config-data\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.121332 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-config-data\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.164652 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-logs\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.164783 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-config-data\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.165465 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-combined-ca-bundle\") pod \"neutron-db-sync-mg4xg\" (UID: 
\"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.165916 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-scripts\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.182167 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-config\") pod \"neutron-db-sync-mg4xg\" (UID: \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.189277 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-horizon-secret-key\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.206479 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.225960 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-scripts\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.226367 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mdhq\" (UniqueName: \"kubernetes.io/projected/32ffb609-7a3b-42b7-b513-7003deefe5dd-kube-api-access-8mdhq\") pod \"cinder-db-sync-g2nmn\" (UID: 
\"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.226656 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32ffb609-7a3b-42b7-b513-7003deefe5dd-etc-machine-id\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.226804 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-db-sync-config-data\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.226904 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-config-data\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.227058 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-combined-ca-bundle\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.232473 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32ffb609-7a3b-42b7-b513-7003deefe5dd-etc-machine-id\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 
20:47:21.234582 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.235906 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-combined-ca-bundle\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.243442 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-config-data\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.258357 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-db-sync-config-data\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.259811 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.260163 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.295802 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llgn6\" (UniqueName: \"kubernetes.io/projected/eef97bfb-4275-4a0a-bae4-5442cf7400dd-kube-api-access-llgn6\") pod \"neutron-db-sync-mg4xg\" (UID: \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.296202 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2wvs\" (UniqueName: \"kubernetes.io/projected/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-kube-api-access-v2wvs\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.309989 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-scripts\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.310607 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mdhq\" (UniqueName: \"kubernetes.io/projected/32ffb609-7a3b-42b7-b513-7003deefe5dd-kube-api-access-8mdhq\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.326321 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.328413 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-config-data\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.328615 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 
20:47:21.328798 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdq79\" (UniqueName: \"kubernetes.io/projected/1abdfade-817b-4659-b8be-48bb516fb866-kube-api-access-bdq79\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.328916 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-scripts\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.329023 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-log-httpd\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.329126 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-run-httpd\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.329260 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.424013 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.453538 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdq79\" (UniqueName: \"kubernetes.io/projected/1abdfade-817b-4659-b8be-48bb516fb866-kube-api-access-bdq79\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.453630 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-scripts\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.453696 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-log-httpd\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.453740 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-run-httpd\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.453846 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.453927 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-config-data\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.453992 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.456584 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-log-httpd\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.459552 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-kkmzk"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.463577 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-run-httpd\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.476062 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.484038 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.486218 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2zqc7" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.486285 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.486513 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-scripts\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.486580 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-config-data\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.486598 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.500940 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc 
kubenswrapper[4790]: I0313 20:47:21.500997 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.504357 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.513910 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.518919 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.519146 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dwzcz" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.519418 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.519578 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.520338 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdq79\" (UniqueName: \"kubernetes.io/projected/1abdfade-817b-4659-b8be-48bb516fb866-kube-api-access-bdq79\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.538316 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-wbb8v"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.540043 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.547858 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.548099 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.548219 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2nkvl" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.559662 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.559714 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.559748 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-scripts\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.559768 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.559787 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw2bf\" (UniqueName: \"kubernetes.io/projected/5dff6930-5d07-4df7-8d42-470ae83afd38-kube-api-access-tw2bf\") pod \"barbican-db-sync-kkmzk\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.559811 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-db-sync-config-data\") pod \"barbican-db-sync-kkmzk\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.559825 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm7n2\" (UniqueName: \"kubernetes.io/projected/9acddcdc-720a-469a-8023-7762f1b7c025-kube-api-access-rm7n2\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.559842 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-combined-ca-bundle\") pod \"barbican-db-sync-kkmzk\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.559880 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-logs\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.559894 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-config-data\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.559925 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.567243 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-kkmzk"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.597434 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-wbb8v"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.655946 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.656322 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.663181 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.663229 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.663253 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.663291 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-scripts\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.663313 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc 
kubenswrapper[4790]: I0313 20:47:21.663333 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw2bf\" (UniqueName: \"kubernetes.io/projected/5dff6930-5d07-4df7-8d42-470ae83afd38-kube-api-access-tw2bf\") pod \"barbican-db-sync-kkmzk\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.663357 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-db-sync-config-data\") pod \"barbican-db-sync-kkmzk\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.663388 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm7n2\" (UniqueName: \"kubernetes.io/projected/9acddcdc-720a-469a-8023-7762f1b7c025-kube-api-access-rm7n2\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.663409 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-combined-ca-bundle\") pod \"barbican-db-sync-kkmzk\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.663451 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-logs\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.663467 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-config-data\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.671824 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.674028 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.675547 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.680623 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-scripts\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.681956 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-config-data\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.682609 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-logs\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.685083 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-combined-ca-bundle\") pod \"barbican-db-sync-kkmzk\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.689557 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-db-sync-config-data\") pod \"barbican-db-sync-kkmzk\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.691979 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw2bf\" (UniqueName: \"kubernetes.io/projected/5dff6930-5d07-4df7-8d42-470ae83afd38-kube-api-access-tw2bf\") pod \"barbican-db-sync-kkmzk\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.696647 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm7n2\" (UniqueName: \"kubernetes.io/projected/9acddcdc-720a-469a-8023-7762f1b7c025-kube-api-access-rm7n2\") pod 
\"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.698368 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.704447 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-87xrs"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.737822 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.757161 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-n8ckq"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.758813 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.765269 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-config-data\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.765316 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8b8bbca-4be9-43d3-b692-0587892a50b4-logs\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.765336 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-scripts\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.765371 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-combined-ca-bundle\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.765510 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s8z8\" (UniqueName: \"kubernetes.io/projected/e8b8bbca-4be9-43d3-b692-0587892a50b4-kube-api-access-6s8z8\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " 
pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.795567 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-n8ckq"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.827218 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-76485b6c5-pjfp4"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.830010 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.837192 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76485b6c5-pjfp4"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.842784 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.847966 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.851400 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.859698 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.864863 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.867325 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7977\" (UniqueName: \"kubernetes.io/projected/7dc42e6e-503c-4931-87e1-adcbf3469570-kube-api-access-t7977\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.867397 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.867423 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-config\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.867471 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-config-data\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " 
pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.867490 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8b8bbca-4be9-43d3-b692-0587892a50b4-logs\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.867508 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-scripts\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.867621 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.867648 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-combined-ca-bundle\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.867673 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 
20:47:21.867691 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.867724 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s8z8\" (UniqueName: \"kubernetes.io/projected/e8b8bbca-4be9-43d3-b692-0587892a50b4-kube-api-access-6s8z8\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.868164 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8b8bbca-4be9-43d3-b692-0587892a50b4-logs\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.871460 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.872112 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-config-data\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.876009 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.877244 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-combined-ca-bundle\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.888469 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-scripts\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.909741 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-87xrs"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.909813 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s8z8\" (UniqueName: \"kubernetes.io/projected/e8b8bbca-4be9-43d3-b692-0587892a50b4-kube-api-access-6s8z8\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.927422 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-bootstrap-rfr4j"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.969972 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970030 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-config-data\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970091 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970123 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cj5w\" (UniqueName: \"kubernetes.io/projected/315f7d58-8f13-4982-a1d6-25b3773f0b1a-kube-api-access-6cj5w\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970150 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970192 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970210 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970232 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970250 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccb1b2e8-4b05-411b-a540-6507fdd5775f-logs\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970315 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdqgz\" (UniqueName: \"kubernetes.io/projected/ccb1b2e8-4b05-411b-a540-6507fdd5775f-kube-api-access-bdqgz\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " 
pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970345 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970369 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ccb1b2e8-4b05-411b-a540-6507fdd5775f-horizon-secret-key\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970577 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7977\" (UniqueName: \"kubernetes.io/projected/7dc42e6e-503c-4931-87e1-adcbf3469570-kube-api-access-t7977\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970598 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970619 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-scripts\") pod \"horizon-76485b6c5-pjfp4\" (UID: 
\"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970898 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970933 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970962 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-config\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.971128 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-logs\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.974571 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.975240 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.975820 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.976658 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.977745 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-config\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.997405 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7977\" (UniqueName: \"kubernetes.io/projected/7dc42e6e-503c-4931-87e1-adcbf3469570-kube-api-access-t7977\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:22 crc kubenswrapper[4790]: 
I0313 20:47:22.078671 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.078780 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ccb1b2e8-4b05-411b-a540-6507fdd5775f-horizon-secret-key\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.078809 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.080303 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.082998 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-scripts\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.083063 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.085864 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-scripts\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.086518 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ccb1b2e8-4b05-411b-a540-6507fdd5775f-horizon-secret-key\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.086638 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.091693 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.092246 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-logs\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.092329 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.092355 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-config-data\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.092405 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cj5w\" (UniqueName: \"kubernetes.io/projected/315f7d58-8f13-4982-a1d6-25b3773f0b1a-kube-api-access-6cj5w\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.092442 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc 
kubenswrapper[4790]: I0313 20:47:22.092465 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.092480 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccb1b2e8-4b05-411b-a540-6507fdd5775f-logs\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.092513 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdqgz\" (UniqueName: \"kubernetes.io/projected/ccb1b2e8-4b05-411b-a540-6507fdd5775f-kube-api-access-bdqgz\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.093304 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-logs\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.098924 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.100291 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-config-data\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.100716 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccb1b2e8-4b05-411b-a540-6507fdd5775f-logs\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.111318 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.112028 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdqgz\" (UniqueName: \"kubernetes.io/projected/ccb1b2e8-4b05-411b-a540-6507fdd5775f-kube-api-access-bdqgz\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.115940 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.122589 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.138046 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cj5w\" (UniqueName: \"kubernetes.io/projected/315f7d58-8f13-4982-a1d6-25b3773f0b1a-kube-api-access-6cj5w\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.155447 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.193512 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-mg4xg"] Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.206046 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.208204 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58656c768f-spczn"] Mar 13 20:47:22 crc kubenswrapper[4790]: W0313 20:47:22.225608 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeef97bfb_4275_4a0a_bae4_5442cf7400dd.slice/crio-e447298b31e6ff846ad9bdd5d124a1ff7c9fbf3f3daae17551c3e92c51b821f0 WatchSource:0}: Error finding container e447298b31e6ff846ad9bdd5d124a1ff7c9fbf3f3daae17551c3e92c51b821f0: Status 404 returned error can't find the container with id e447298b31e6ff846ad9bdd5d124a1ff7c9fbf3f3daae17551c3e92c51b821f0 Mar 13 20:47:22 crc kubenswrapper[4790]: W0313 20:47:22.248655 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f7f54d0_0f93_497b_b5cb_2a35d7dc68f6.slice/crio-5c692e4cb4d1561525c952071763fb787c93bbc98ee4b7e875e7458714b9da0a WatchSource:0}: Error finding container 5c692e4cb4d1561525c952071763fb787c93bbc98ee4b7e875e7458714b9da0a: Status 404 returned error can't find the container with id 5c692e4cb4d1561525c952071763fb787c93bbc98ee4b7e875e7458714b9da0a Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.323969 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-g2nmn"] Mar 13 20:47:22 crc kubenswrapper[4790]: W0313 20:47:22.339342 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32ffb609_7a3b_42b7_b513_7003deefe5dd.slice/crio-593ecf2c1e6edaf48caa97f46955c4d04cc6ddaa0effd6b586de30210f0a0ecd WatchSource:0}: Error finding container 593ecf2c1e6edaf48caa97f46955c4d04cc6ddaa0effd6b586de30210f0a0ecd: Status 404 returned error can't find the container with id 
593ecf2c1e6edaf48caa97f46955c4d04cc6ddaa0effd6b586de30210f0a0ecd Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.368851 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.399427 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.404690 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:47:22 crc kubenswrapper[4790]: W0313 20:47:22.427870 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1abdfade_817b_4659_b8be_48bb516fb866.slice/crio-3240e14626b1a27ca0670703b5e37bc443567929afbc6effd03a1f681e6eeda6 WatchSource:0}: Error finding container 3240e14626b1a27ca0670703b5e37bc443567929afbc6effd03a1f681e6eeda6: Status 404 returned error can't find the container with id 3240e14626b1a27ca0670703b5e37bc443567929afbc6effd03a1f681e6eeda6 Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.434551 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g2nmn" event={"ID":"32ffb609-7a3b-42b7-b513-7003deefe5dd","Type":"ContainerStarted","Data":"593ecf2c1e6edaf48caa97f46955c4d04cc6ddaa0effd6b586de30210f0a0ecd"} Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.435759 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58656c768f-spczn" event={"ID":"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6","Type":"ContainerStarted","Data":"5c692e4cb4d1561525c952071763fb787c93bbc98ee4b7e875e7458714b9da0a"} Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.436986 4790 generic.go:334] "Generic (PLEG): container finished" podID="016f2a5b-9c42-4f7b-bf5f-42eb5010b321" containerID="0747b757edc09b2a27e6d814254501a0191a898c87d268e337ba01775251ef0e" exitCode=0 Mar 13 
20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.437035 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-87xrs" event={"ID":"016f2a5b-9c42-4f7b-bf5f-42eb5010b321","Type":"ContainerDied","Data":"0747b757edc09b2a27e6d814254501a0191a898c87d268e337ba01775251ef0e"} Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.437051 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-87xrs" event={"ID":"016f2a5b-9c42-4f7b-bf5f-42eb5010b321","Type":"ContainerStarted","Data":"44f1620af41208c3efd1e9ed5400e4f2a150da135c8bb09cbfabc990bfdce7d6"} Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.439347 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rfr4j" event={"ID":"5c397c6e-8d19-4b92-bc31-61312531b3d9","Type":"ContainerStarted","Data":"be394eadb1a12955ac79ebd44714ea2fd283def65154fb6c18e14cac83eb1a07"} Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.439600 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rfr4j" event={"ID":"5c397c6e-8d19-4b92-bc31-61312531b3d9","Type":"ContainerStarted","Data":"f6dce01b5701bc8518d8b3503d0cb256f3f959202a6e0fa8b6a41a1cef8da1af"} Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.444892 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mg4xg" event={"ID":"eef97bfb-4275-4a0a-bae4-5442cf7400dd","Type":"ContainerStarted","Data":"e447298b31e6ff846ad9bdd5d124a1ff7c9fbf3f3daae17551c3e92c51b821f0"} Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.501812 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rfr4j" podStartSLOduration=2.501787926 podStartE2EDuration="2.501787926s" podCreationTimestamp="2026-03-13 20:47:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 
20:47:22.478914622 +0000 UTC m=+1173.500030513" watchObservedRunningTime="2026-03-13 20:47:22.501787926 +0000 UTC m=+1173.522903817" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.515336 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-kkmzk"] Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.655687 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-n8ckq"] Mar 13 20:47:22 crc kubenswrapper[4790]: W0313 20:47:22.657168 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dc42e6e_503c_4931_87e1_adcbf3469570.slice/crio-6f999ef5f392142853bfd734e8f05087e30b1884df0a3fcb1f826bf8ee332e9d WatchSource:0}: Error finding container 6f999ef5f392142853bfd734e8f05087e30b1884df0a3fcb1f826bf8ee332e9d: Status 404 returned error can't find the container with id 6f999ef5f392142853bfd734e8f05087e30b1884df0a3fcb1f826bf8ee332e9d Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.664716 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.841091 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-wbb8v"] Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.996200 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.072351 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76485b6c5-pjfp4"] Mar 13 20:47:23 crc kubenswrapper[4790]: W0313 20:47:23.074863 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccb1b2e8_4b05_411b_a540_6507fdd5775f.slice/crio-0043283db8557a3c195bf15d2769b7b9f5dc50b554145edd4e6c6abedb9ab898 WatchSource:0}: Error finding container 0043283db8557a3c195bf15d2769b7b9f5dc50b554145edd4e6c6abedb9ab898: Status 404 returned error can't find the container with id 0043283db8557a3c195bf15d2769b7b9f5dc50b554145edd4e6c6abedb9ab898 Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.123296 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-sb\") pod \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.123333 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-swift-storage-0\") pod \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.123358 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-nb\") pod \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.123640 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-config\") pod \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.123752 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qth2k\" (UniqueName: \"kubernetes.io/projected/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-kube-api-access-qth2k\") pod \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.123909 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-svc\") pod \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.153170 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-kube-api-access-qth2k" (OuterVolumeSpecName: "kube-api-access-qth2k") pod "016f2a5b-9c42-4f7b-bf5f-42eb5010b321" (UID: "016f2a5b-9c42-4f7b-bf5f-42eb5010b321"). InnerVolumeSpecName "kube-api-access-qth2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.154038 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "016f2a5b-9c42-4f7b-bf5f-42eb5010b321" (UID: "016f2a5b-9c42-4f7b-bf5f-42eb5010b321"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.154073 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "016f2a5b-9c42-4f7b-bf5f-42eb5010b321" (UID: "016f2a5b-9c42-4f7b-bf5f-42eb5010b321"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.168074 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "016f2a5b-9c42-4f7b-bf5f-42eb5010b321" (UID: "016f2a5b-9c42-4f7b-bf5f-42eb5010b321"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.176963 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "016f2a5b-9c42-4f7b-bf5f-42eb5010b321" (UID: "016f2a5b-9c42-4f7b-bf5f-42eb5010b321"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.190728 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.208924 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-config" (OuterVolumeSpecName: "config") pod "016f2a5b-9c42-4f7b-bf5f-42eb5010b321" (UID: "016f2a5b-9c42-4f7b-bf5f-42eb5010b321"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.225754 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qth2k\" (UniqueName: \"kubernetes.io/projected/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-kube-api-access-qth2k\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.225811 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.225850 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.225863 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.225874 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.225885 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.465352 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.499212 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"9acddcdc-720a-469a-8023-7762f1b7c025","Type":"ContainerStarted","Data":"ae6303d4ad793ad0e64f0d47ea54d176c052de04140f8c58d197bb273bafc45e"} Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.544638 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58656c768f-spczn"] Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.544915 4790 generic.go:334] "Generic (PLEG): container finished" podID="7dc42e6e-503c-4931-87e1-adcbf3469570" containerID="365da7d7e0570f42f94165ed1add103834755db474d98891c1b86296bdc4478f" exitCode=0 Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.545003 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" event={"ID":"7dc42e6e-503c-4931-87e1-adcbf3469570","Type":"ContainerDied","Data":"365da7d7e0570f42f94165ed1add103834755db474d98891c1b86296bdc4478f"} Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.545028 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" event={"ID":"7dc42e6e-503c-4931-87e1-adcbf3469570","Type":"ContainerStarted","Data":"6f999ef5f392142853bfd734e8f05087e30b1884df0a3fcb1f826bf8ee332e9d"} Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.559482 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.564169 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-54dbf7ffd5-z6rf5"] Mar 13 20:47:23 crc kubenswrapper[4790]: E0313 20:47:23.567313 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="016f2a5b-9c42-4f7b-bf5f-42eb5010b321" containerName="init" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.567335 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="016f2a5b-9c42-4f7b-bf5f-42eb5010b321" containerName="init" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.567551 4790 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="016f2a5b-9c42-4f7b-bf5f-42eb5010b321" containerName="init" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.587850 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.595547 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54dbf7ffd5-z6rf5"] Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.627138 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-87xrs" event={"ID":"016f2a5b-9c42-4f7b-bf5f-42eb5010b321","Type":"ContainerDied","Data":"44f1620af41208c3efd1e9ed5400e4f2a150da135c8bb09cbfabc990bfdce7d6"} Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.627239 4790 scope.go:117] "RemoveContainer" containerID="0747b757edc09b2a27e6d814254501a0191a898c87d268e337ba01775251ef0e" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.627446 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.655095 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kkmzk" event={"ID":"5dff6930-5d07-4df7-8d42-470ae83afd38","Type":"ContainerStarted","Data":"66bbb0c4358595b69723c41e22c295dc43c704b8a97a66ffe918a90a7b96cb73"} Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.661572 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.715288 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1abdfade-817b-4659-b8be-48bb516fb866","Type":"ContainerStarted","Data":"3240e14626b1a27ca0670703b5e37bc443567929afbc6effd03a1f681e6eeda6"} Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.723729 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mg4xg" event={"ID":"eef97bfb-4275-4a0a-bae4-5442cf7400dd","Type":"ContainerStarted","Data":"0f4d13a4ad3c2ce36bd8fc01aafd587a060f2b33fce34cbf54f0cbd83e9fb1ca"} Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.733695 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76485b6c5-pjfp4" event={"ID":"ccb1b2e8-4b05-411b-a540-6507fdd5775f","Type":"ContainerStarted","Data":"0043283db8557a3c195bf15d2769b7b9f5dc50b554145edd4e6c6abedb9ab898"} Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.755250 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdc44913-44bd-4899-8f7b-d4908bad33c3-logs\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.755306 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-scripts\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.755347 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bdc44913-44bd-4899-8f7b-d4908bad33c3-horizon-secret-key\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.755430 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-config-data\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.755457 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq4pr\" (UniqueName: \"kubernetes.io/projected/bdc44913-44bd-4899-8f7b-d4908bad33c3-kube-api-access-lq4pr\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.800869 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"315f7d58-8f13-4982-a1d6-25b3773f0b1a","Type":"ContainerStarted","Data":"fea6afc911ca7e2dd3477729e74613955f874b4583f2a3acc5c3410afdd753e9"} Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.820299 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-87xrs"] Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.834777 4790 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/placement-db-sync-wbb8v" event={"ID":"e8b8bbca-4be9-43d3-b692-0587892a50b4","Type":"ContainerStarted","Data":"a21e9d9d91f185d12ea208152b90be684a6632b64a67e174026d61018c3b2d9d"} Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.856976 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-87xrs"] Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.857110 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-scripts\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.857205 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bdc44913-44bd-4899-8f7b-d4908bad33c3-horizon-secret-key\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.857291 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-config-data\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.857322 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq4pr\" (UniqueName: \"kubernetes.io/projected/bdc44913-44bd-4899-8f7b-d4908bad33c3-kube-api-access-lq4pr\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.857419 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdc44913-44bd-4899-8f7b-d4908bad33c3-logs\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.857923 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdc44913-44bd-4899-8f7b-d4908bad33c3-logs\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.858764 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-scripts\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.860089 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-config-data\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.865306 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-mg4xg" podStartSLOduration=3.865289165 podStartE2EDuration="3.865289165s" podCreationTimestamp="2026-03-13 20:47:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:23.772882454 +0000 UTC m=+1174.793998345" watchObservedRunningTime="2026-03-13 20:47:23.865289165 +0000 UTC m=+1174.886405046" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.884144 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq4pr\" (UniqueName: \"kubernetes.io/projected/bdc44913-44bd-4899-8f7b-d4908bad33c3-kube-api-access-lq4pr\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.894784 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bdc44913-44bd-4899-8f7b-d4908bad33c3-horizon-secret-key\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.947775 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:24 crc kubenswrapper[4790]: I0313 20:47:24.574838 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54dbf7ffd5-z6rf5"] Mar 13 20:47:24 crc kubenswrapper[4790]: I0313 20:47:24.854602 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"315f7d58-8f13-4982-a1d6-25b3773f0b1a","Type":"ContainerStarted","Data":"f7f9d1b3630ded700d395f27b979ba4f7e495b7e7a9a351c560af831a148e3d8"} Mar 13 20:47:24 crc kubenswrapper[4790]: I0313 20:47:24.870323 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" event={"ID":"7dc42e6e-503c-4931-87e1-adcbf3469570","Type":"ContainerStarted","Data":"a8c73c53b8b1c75ce690b997abf213fa210d425aaf795980f205e05d12075a77"} Mar 13 20:47:24 crc kubenswrapper[4790]: I0313 20:47:24.871043 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:24 crc kubenswrapper[4790]: I0313 20:47:24.873923 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"9acddcdc-720a-469a-8023-7762f1b7c025","Type":"ContainerStarted","Data":"3616d5067b96b923fdb7687f09ebcaab7f10f570884d08425f74771298051193"} Mar 13 20:47:24 crc kubenswrapper[4790]: I0313 20:47:24.876313 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54dbf7ffd5-z6rf5" event={"ID":"bdc44913-44bd-4899-8f7b-d4908bad33c3","Type":"ContainerStarted","Data":"bb504ca44dce1509b49f6335cf8ea3f3e23e6aefb5e7712baa5b25d4cf19fcdc"} Mar 13 20:47:24 crc kubenswrapper[4790]: I0313 20:47:24.904187 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" podStartSLOduration=3.9041639999999997 podStartE2EDuration="3.904164s" podCreationTimestamp="2026-03-13 20:47:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:24.897819347 +0000 UTC m=+1175.918935268" watchObservedRunningTime="2026-03-13 20:47:24.904164 +0000 UTC m=+1175.925279901" Mar 13 20:47:25 crc kubenswrapper[4790]: I0313 20:47:25.674278 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="016f2a5b-9c42-4f7b-bf5f-42eb5010b321" path="/var/lib/kubelet/pods/016f2a5b-9c42-4f7b-bf5f-42eb5010b321/volumes" Mar 13 20:47:25 crc kubenswrapper[4790]: I0313 20:47:25.896891 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9acddcdc-720a-469a-8023-7762f1b7c025","Type":"ContainerStarted","Data":"15ece6b8455b1981485fe94641a2ac4a65bad6b4ff6e1fde766f9de31aa3ea24"} Mar 13 20:47:25 crc kubenswrapper[4790]: I0313 20:47:25.897088 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9acddcdc-720a-469a-8023-7762f1b7c025" containerName="glance-log" containerID="cri-o://3616d5067b96b923fdb7687f09ebcaab7f10f570884d08425f74771298051193" 
gracePeriod=30 Mar 13 20:47:25 crc kubenswrapper[4790]: I0313 20:47:25.897651 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9acddcdc-720a-469a-8023-7762f1b7c025" containerName="glance-httpd" containerID="cri-o://15ece6b8455b1981485fe94641a2ac4a65bad6b4ff6e1fde766f9de31aa3ea24" gracePeriod=30 Mar 13 20:47:25 crc kubenswrapper[4790]: I0313 20:47:25.901967 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="315f7d58-8f13-4982-a1d6-25b3773f0b1a" containerName="glance-log" containerID="cri-o://f7f9d1b3630ded700d395f27b979ba4f7e495b7e7a9a351c560af831a148e3d8" gracePeriod=30 Mar 13 20:47:25 crc kubenswrapper[4790]: I0313 20:47:25.902052 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"315f7d58-8f13-4982-a1d6-25b3773f0b1a","Type":"ContainerStarted","Data":"d5c5cf67155ec3e79570c279fba860c4f2a62f631dbcd5d56f4fd892ff4992f2"} Mar 13 20:47:25 crc kubenswrapper[4790]: I0313 20:47:25.902299 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="315f7d58-8f13-4982-a1d6-25b3773f0b1a" containerName="glance-httpd" containerID="cri-o://d5c5cf67155ec3e79570c279fba860c4f2a62f631dbcd5d56f4fd892ff4992f2" gracePeriod=30 Mar 13 20:47:25 crc kubenswrapper[4790]: I0313 20:47:25.941581 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.9415629039999995 podStartE2EDuration="4.941562904s" podCreationTimestamp="2026-03-13 20:47:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:25.922236847 +0000 UTC m=+1176.943352738" watchObservedRunningTime="2026-03-13 20:47:25.941562904 +0000 UTC m=+1176.962678795" Mar 13 
20:47:25 crc kubenswrapper[4790]: I0313 20:47:25.943268 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.943258261 podStartE2EDuration="4.943258261s" podCreationTimestamp="2026-03-13 20:47:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:25.939646141 +0000 UTC m=+1176.960762062" watchObservedRunningTime="2026-03-13 20:47:25.943258261 +0000 UTC m=+1176.964374152" Mar 13 20:47:26 crc kubenswrapper[4790]: I0313 20:47:26.919292 4790 generic.go:334] "Generic (PLEG): container finished" podID="315f7d58-8f13-4982-a1d6-25b3773f0b1a" containerID="d5c5cf67155ec3e79570c279fba860c4f2a62f631dbcd5d56f4fd892ff4992f2" exitCode=0 Mar 13 20:47:26 crc kubenswrapper[4790]: I0313 20:47:26.919651 4790 generic.go:334] "Generic (PLEG): container finished" podID="315f7d58-8f13-4982-a1d6-25b3773f0b1a" containerID="f7f9d1b3630ded700d395f27b979ba4f7e495b7e7a9a351c560af831a148e3d8" exitCode=143 Mar 13 20:47:26 crc kubenswrapper[4790]: I0313 20:47:26.919388 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"315f7d58-8f13-4982-a1d6-25b3773f0b1a","Type":"ContainerDied","Data":"d5c5cf67155ec3e79570c279fba860c4f2a62f631dbcd5d56f4fd892ff4992f2"} Mar 13 20:47:26 crc kubenswrapper[4790]: I0313 20:47:26.919719 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"315f7d58-8f13-4982-a1d6-25b3773f0b1a","Type":"ContainerDied","Data":"f7f9d1b3630ded700d395f27b979ba4f7e495b7e7a9a351c560af831a148e3d8"} Mar 13 20:47:26 crc kubenswrapper[4790]: I0313 20:47:26.927352 4790 generic.go:334] "Generic (PLEG): container finished" podID="9acddcdc-720a-469a-8023-7762f1b7c025" containerID="15ece6b8455b1981485fe94641a2ac4a65bad6b4ff6e1fde766f9de31aa3ea24" exitCode=0 Mar 13 20:47:26 crc 
kubenswrapper[4790]: I0313 20:47:26.927398 4790 generic.go:334] "Generic (PLEG): container finished" podID="9acddcdc-720a-469a-8023-7762f1b7c025" containerID="3616d5067b96b923fdb7687f09ebcaab7f10f570884d08425f74771298051193" exitCode=143 Mar 13 20:47:26 crc kubenswrapper[4790]: I0313 20:47:26.927418 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9acddcdc-720a-469a-8023-7762f1b7c025","Type":"ContainerDied","Data":"15ece6b8455b1981485fe94641a2ac4a65bad6b4ff6e1fde766f9de31aa3ea24"} Mar 13 20:47:26 crc kubenswrapper[4790]: I0313 20:47:26.927445 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9acddcdc-720a-469a-8023-7762f1b7c025","Type":"ContainerDied","Data":"3616d5067b96b923fdb7687f09ebcaab7f10f570884d08425f74771298051193"} Mar 13 20:47:28 crc kubenswrapper[4790]: I0313 20:47:28.953638 4790 generic.go:334] "Generic (PLEG): container finished" podID="5c397c6e-8d19-4b92-bc31-61312531b3d9" containerID="be394eadb1a12955ac79ebd44714ea2fd283def65154fb6c18e14cac83eb1a07" exitCode=0 Mar 13 20:47:28 crc kubenswrapper[4790]: I0313 20:47:28.954397 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rfr4j" event={"ID":"5c397c6e-8d19-4b92-bc31-61312531b3d9","Type":"ContainerDied","Data":"be394eadb1a12955ac79ebd44714ea2fd283def65154fb6c18e14cac83eb1a07"} Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.055782 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76485b6c5-pjfp4"] Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.098600 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-77655f674d-4r7h4"] Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.100143 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.104005 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.122624 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77655f674d-4r7h4"] Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.133555 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6t9k\" (UniqueName: \"kubernetes.io/projected/596ad32f-9087-4dbe-a495-8bf03200cd60-kube-api-access-l6t9k\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.133662 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-secret-key\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.133737 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-scripts\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.133768 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-config-data\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 
20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.133791 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596ad32f-9087-4dbe-a495-8bf03200cd60-logs\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.133836 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-combined-ca-bundle\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.133885 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-tls-certs\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.136790 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54dbf7ffd5-z6rf5"] Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.170189 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-686b857b8-6fghv"] Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.171587 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.194637 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-686b857b8-6fghv"] Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235149 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0f5105d-51ea-4e5e-832f-8302188a943a-logs\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235226 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-tls-certs\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235250 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0f5105d-51ea-4e5e-832f-8302188a943a-config-data\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235298 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6t9k\" (UniqueName: \"kubernetes.io/projected/596ad32f-9087-4dbe-a495-8bf03200cd60-kube-api-access-l6t9k\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235361 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d0f5105d-51ea-4e5e-832f-8302188a943a-scripts\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235508 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-secret-key\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235583 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f5105d-51ea-4e5e-832f-8302188a943a-horizon-tls-certs\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235649 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d0f5105d-51ea-4e5e-832f-8302188a943a-horizon-secret-key\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235680 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-scripts\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235715 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-config-data\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235729 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596ad32f-9087-4dbe-a495-8bf03200cd60-logs\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235789 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f5105d-51ea-4e5e-832f-8302188a943a-combined-ca-bundle\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235822 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-755d7\" (UniqueName: \"kubernetes.io/projected/d0f5105d-51ea-4e5e-832f-8302188a943a-kube-api-access-755d7\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235845 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-combined-ca-bundle\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.236748 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-scripts\") 
pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.236807 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596ad32f-9087-4dbe-a495-8bf03200cd60-logs\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.237517 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-config-data\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.246593 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-tls-certs\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.254811 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6t9k\" (UniqueName: \"kubernetes.io/projected/596ad32f-9087-4dbe-a495-8bf03200cd60-kube-api-access-l6t9k\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.263707 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-combined-ca-bundle\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 
20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.264350 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-secret-key\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.337617 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d0f5105d-51ea-4e5e-832f-8302188a943a-horizon-secret-key\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.337699 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f5105d-51ea-4e5e-832f-8302188a943a-combined-ca-bundle\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.337717 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-755d7\" (UniqueName: \"kubernetes.io/projected/d0f5105d-51ea-4e5e-832f-8302188a943a-kube-api-access-755d7\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.337745 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0f5105d-51ea-4e5e-832f-8302188a943a-logs\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.337773 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0f5105d-51ea-4e5e-832f-8302188a943a-config-data\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.337812 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0f5105d-51ea-4e5e-832f-8302188a943a-scripts\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.337869 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f5105d-51ea-4e5e-832f-8302188a943a-horizon-tls-certs\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.338233 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0f5105d-51ea-4e5e-832f-8302188a943a-logs\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.339216 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0f5105d-51ea-4e5e-832f-8302188a943a-scripts\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.339401 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0f5105d-51ea-4e5e-832f-8302188a943a-config-data\") pod 
\"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.341183 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f5105d-51ea-4e5e-832f-8302188a943a-combined-ca-bundle\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.344423 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d0f5105d-51ea-4e5e-832f-8302188a943a-horizon-secret-key\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.345752 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f5105d-51ea-4e5e-832f-8302188a943a-horizon-tls-certs\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.362564 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-755d7\" (UniqueName: \"kubernetes.io/projected/d0f5105d-51ea-4e5e-832f-8302188a943a-kube-api-access-755d7\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.427674 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.494004 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:32 crc kubenswrapper[4790]: I0313 20:47:32.094035 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:32 crc kubenswrapper[4790]: I0313 20:47:32.168764 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-v8dxb"] Mar 13 20:47:32 crc kubenswrapper[4790]: I0313 20:47:32.169016 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" podUID="34d41874-8dfa-4e3d-9298-d027a3e3c921" containerName="dnsmasq-dns" containerID="cri-o://267c9dc16cc049015bd4edf304ecb796705f9133394d3f5b1d188823da72e942" gracePeriod=10 Mar 13 20:47:32 crc kubenswrapper[4790]: I0313 20:47:32.994342 4790 generic.go:334] "Generic (PLEG): container finished" podID="34d41874-8dfa-4e3d-9298-d027a3e3c921" containerID="267c9dc16cc049015bd4edf304ecb796705f9133394d3f5b1d188823da72e942" exitCode=0 Mar 13 20:47:32 crc kubenswrapper[4790]: I0313 20:47:32.994400 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" event={"ID":"34d41874-8dfa-4e3d-9298-d027a3e3c921","Type":"ContainerDied","Data":"267c9dc16cc049015bd4edf304ecb796705f9133394d3f5b1d188823da72e942"} Mar 13 20:47:36 crc kubenswrapper[4790]: I0313 20:47:36.776078 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" podUID="34d41874-8dfa-4e3d-9298-d027a3e3c921" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.007601 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.028455 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"315f7d58-8f13-4982-a1d6-25b3773f0b1a","Type":"ContainerDied","Data":"fea6afc911ca7e2dd3477729e74613955f874b4583f2a3acc5c3410afdd753e9"} Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.028510 4790 scope.go:117] "RemoveContainer" containerID="d5c5cf67155ec3e79570c279fba860c4f2a62f631dbcd5d56f4fd892ff4992f2" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.028535 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.193039 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-scripts\") pod \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.194577 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-httpd-run\") pod \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.194630 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-internal-tls-certs\") pod \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.194688 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-config-data\") pod \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.194873 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cj5w\" (UniqueName: \"kubernetes.io/projected/315f7d58-8f13-4982-a1d6-25b3773f0b1a-kube-api-access-6cj5w\") pod \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.194930 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "315f7d58-8f13-4982-a1d6-25b3773f0b1a" (UID: "315f7d58-8f13-4982-a1d6-25b3773f0b1a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.194990 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-combined-ca-bundle\") pod \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.195068 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.195153 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-logs\") pod \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " Mar 13 20:47:37 crc 
kubenswrapper[4790]: I0313 20:47:37.195642 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-logs" (OuterVolumeSpecName: "logs") pod "315f7d58-8f13-4982-a1d6-25b3773f0b1a" (UID: "315f7d58-8f13-4982-a1d6-25b3773f0b1a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.195657 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.200557 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-scripts" (OuterVolumeSpecName: "scripts") pod "315f7d58-8f13-4982-a1d6-25b3773f0b1a" (UID: "315f7d58-8f13-4982-a1d6-25b3773f0b1a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.212591 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/315f7d58-8f13-4982-a1d6-25b3773f0b1a-kube-api-access-6cj5w" (OuterVolumeSpecName: "kube-api-access-6cj5w") pod "315f7d58-8f13-4982-a1d6-25b3773f0b1a" (UID: "315f7d58-8f13-4982-a1d6-25b3773f0b1a"). InnerVolumeSpecName "kube-api-access-6cj5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.214559 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "315f7d58-8f13-4982-a1d6-25b3773f0b1a" (UID: "315f7d58-8f13-4982-a1d6-25b3773f0b1a"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.221970 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "315f7d58-8f13-4982-a1d6-25b3773f0b1a" (UID: "315f7d58-8f13-4982-a1d6-25b3773f0b1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.239912 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "315f7d58-8f13-4982-a1d6-25b3773f0b1a" (UID: "315f7d58-8f13-4982-a1d6-25b3773f0b1a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.249087 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-config-data" (OuterVolumeSpecName: "config-data") pod "315f7d58-8f13-4982-a1d6-25b3773f0b1a" (UID: "315f7d58-8f13-4982-a1d6-25b3773f0b1a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.297576 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.297621 4790 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.297637 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.297652 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cj5w\" (UniqueName: \"kubernetes.io/projected/315f7d58-8f13-4982-a1d6-25b3773f0b1a-kube-api-access-6cj5w\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.297664 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.297702 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.297717 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.322275 4790 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.365314 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.378285 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.391150 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:47:37 crc kubenswrapper[4790]: E0313 20:47:37.391649 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="315f7d58-8f13-4982-a1d6-25b3773f0b1a" containerName="glance-log" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.391664 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="315f7d58-8f13-4982-a1d6-25b3773f0b1a" containerName="glance-log" Mar 13 20:47:37 crc kubenswrapper[4790]: E0313 20:47:37.391716 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="315f7d58-8f13-4982-a1d6-25b3773f0b1a" containerName="glance-httpd" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.391726 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="315f7d58-8f13-4982-a1d6-25b3773f0b1a" containerName="glance-httpd" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.391956 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="315f7d58-8f13-4982-a1d6-25b3773f0b1a" containerName="glance-httpd" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.391979 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="315f7d58-8f13-4982-a1d6-25b3773f0b1a" containerName="glance-log" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.393112 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.401916 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.413434 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.413691 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.432962 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.445866 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.533943 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-scripts\") pod \"9acddcdc-720a-469a-8023-7762f1b7c025\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.534037 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-combined-ca-bundle\") pod \"9acddcdc-720a-469a-8023-7762f1b7c025\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.534096 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm7n2\" (UniqueName: \"kubernetes.io/projected/9acddcdc-720a-469a-8023-7762f1b7c025-kube-api-access-rm7n2\") 
pod \"9acddcdc-720a-469a-8023-7762f1b7c025\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.534136 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"9acddcdc-720a-469a-8023-7762f1b7c025\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.534200 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-logs\") pod \"9acddcdc-720a-469a-8023-7762f1b7c025\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.534231 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-httpd-run\") pod \"9acddcdc-720a-469a-8023-7762f1b7c025\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.534263 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-config-data\") pod \"9acddcdc-720a-469a-8023-7762f1b7c025\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.534337 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-public-tls-certs\") pod \"9acddcdc-720a-469a-8023-7762f1b7c025\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.535084 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9acddcdc-720a-469a-8023-7762f1b7c025" (UID: "9acddcdc-720a-469a-8023-7762f1b7c025"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.535155 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-logs" (OuterVolumeSpecName: "logs") pod "9acddcdc-720a-469a-8023-7762f1b7c025" (UID: "9acddcdc-720a-469a-8023-7762f1b7c025"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.537078 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.537309 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.537400 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.537424 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.537475 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.537513 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.537568 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg82c\" (UniqueName: \"kubernetes.io/projected/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-kube-api-access-qg82c\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.537626 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-logs\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 
20:47:37.537703 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.537717 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.540661 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-scripts" (OuterVolumeSpecName: "scripts") pod "9acddcdc-720a-469a-8023-7762f1b7c025" (UID: "9acddcdc-720a-469a-8023-7762f1b7c025"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.543028 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9acddcdc-720a-469a-8023-7762f1b7c025-kube-api-access-rm7n2" (OuterVolumeSpecName: "kube-api-access-rm7n2") pod "9acddcdc-720a-469a-8023-7762f1b7c025" (UID: "9acddcdc-720a-469a-8023-7762f1b7c025"). InnerVolumeSpecName "kube-api-access-rm7n2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.564072 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "9acddcdc-720a-469a-8023-7762f1b7c025" (UID: "9acddcdc-720a-469a-8023-7762f1b7c025"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.569458 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9acddcdc-720a-469a-8023-7762f1b7c025" (UID: "9acddcdc-720a-469a-8023-7762f1b7c025"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.611915 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-config-data" (OuterVolumeSpecName: "config-data") pod "9acddcdc-720a-469a-8023-7762f1b7c025" (UID: "9acddcdc-720a-469a-8023-7762f1b7c025"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.632265 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9acddcdc-720a-469a-8023-7762f1b7c025" (UID: "9acddcdc-720a-469a-8023-7762f1b7c025"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646418 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646526 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646559 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646575 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646601 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 
crc kubenswrapper[4790]: I0313 20:47:37.646624 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646657 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg82c\" (UniqueName: \"kubernetes.io/projected/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-kube-api-access-qg82c\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646686 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-logs\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646730 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646740 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646751 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm7n2\" (UniqueName: \"kubernetes.io/projected/9acddcdc-720a-469a-8023-7762f1b7c025-kube-api-access-rm7n2\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646772 
4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646780 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646789 4790 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.655200 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.655757 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.656070 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-logs\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.661346 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.668357 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.674540 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.678360 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.678994 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.683435 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="315f7d58-8f13-4982-a1d6-25b3773f0b1a" path="/var/lib/kubelet/pods/315f7d58-8f13-4982-a1d6-25b3773f0b1a/volumes" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.684923 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qg82c\" (UniqueName: \"kubernetes.io/projected/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-kube-api-access-qg82c\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.694312 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.739237 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.750368 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.037903 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9acddcdc-720a-469a-8023-7762f1b7c025","Type":"ContainerDied","Data":"ae6303d4ad793ad0e64f0d47ea54d176c052de04140f8c58d197bb273bafc45e"} Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.039061 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.070011 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.088048 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.103475 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:47:38 crc kubenswrapper[4790]: E0313 20:47:38.103931 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9acddcdc-720a-469a-8023-7762f1b7c025" containerName="glance-log" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.103944 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9acddcdc-720a-469a-8023-7762f1b7c025" containerName="glance-log" Mar 13 20:47:38 crc kubenswrapper[4790]: E0313 20:47:38.103972 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9acddcdc-720a-469a-8023-7762f1b7c025" containerName="glance-httpd" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.103978 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9acddcdc-720a-469a-8023-7762f1b7c025" containerName="glance-httpd" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.104131 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="9acddcdc-720a-469a-8023-7762f1b7c025" containerName="glance-log" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.104144 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="9acddcdc-720a-469a-8023-7762f1b7c025" containerName="glance-httpd" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.105149 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.110896 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.111180 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.118259 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.164074 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2cwz\" (UniqueName: \"kubernetes.io/projected/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-kube-api-access-j2cwz\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.164160 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.164192 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.164221 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-logs\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.164254 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-scripts\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.164344 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.164478 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.164518 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-config-data\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.266226 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-config-data\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.266398 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2cwz\" (UniqueName: \"kubernetes.io/projected/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-kube-api-access-j2cwz\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.266459 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.266486 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.266516 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-logs\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.266550 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.266626 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.266658 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.266970 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.267213 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.267590 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-logs\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " 
pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.273080 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-scripts\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.273088 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.273167 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.273727 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-config-data\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.282610 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2cwz\" (UniqueName: \"kubernetes.io/projected/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-kube-api-access-j2cwz\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: 
I0313 20:47:38.291687 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.431785 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 20:47:39 crc kubenswrapper[4790]: I0313 20:47:39.672636 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9acddcdc-720a-469a-8023-7762f1b7c025" path="/var/lib/kubelet/pods/9acddcdc-720a-469a-8023-7762f1b7c025/volumes" Mar 13 20:47:40 crc kubenswrapper[4790]: E0313 20:47:40.563807 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 13 20:47:40 crc kubenswrapper[4790]: E0313 20:47:40.563980 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8h694h675h665h85h67ch86h97h5h66dhc6h6h9fhb6h8hc5h5cfh67fh577h6dhdch578h69h58fh594h5b7h56dhdbhc8h68bh645h55q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bdq79,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1abdfade-817b-4659-b8be-48bb516fb866): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:47:40 crc kubenswrapper[4790]: E0313 20:47:40.583641 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 13 20:47:40 crc kubenswrapper[4790]: E0313 20:47:40.583841 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n665h5ch5bh66dh66bh94h67h9fh5d6h5d4h56dh58ch7bh697h8h99h74hbch569h5dhbfh54bh64dh5c4h59h84h59dh589h5b5h58hf4h5b8q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2wvs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-58656c768f-spczn_openstack(4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:47:40 crc kubenswrapper[4790]: E0313 
20:47:40.585874 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-58656c768f-spczn" podUID="4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6" Mar 13 20:47:40 crc kubenswrapper[4790]: E0313 20:47:40.597721 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 13 20:47:40 crc kubenswrapper[4790]: E0313 20:47:40.597909 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n595h4h568h5ch68fh5fbh588h645h8ch64h694h666h5fch586h65bh546h564h699h645h5fbh69h696h6chbch5b7h5f6hdch9ch5cfh8dh88h8dq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lq4pr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-54dbf7ffd5-z6rf5_openstack(bdc44913-44bd-4899-8f7b-d4908bad33c3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:47:40 crc kubenswrapper[4790]: E0313 
20:47:40.600784 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-54dbf7ffd5-z6rf5" podUID="bdc44913-44bd-4899-8f7b-d4908bad33c3" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.629167 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.708732 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-scripts\") pod \"5c397c6e-8d19-4b92-bc31-61312531b3d9\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.708873 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-fernet-keys\") pod \"5c397c6e-8d19-4b92-bc31-61312531b3d9\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.708921 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-combined-ca-bundle\") pod \"5c397c6e-8d19-4b92-bc31-61312531b3d9\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.708997 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcf2d\" (UniqueName: \"kubernetes.io/projected/5c397c6e-8d19-4b92-bc31-61312531b3d9-kube-api-access-kcf2d\") pod 
\"5c397c6e-8d19-4b92-bc31-61312531b3d9\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.709114 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-config-data\") pod \"5c397c6e-8d19-4b92-bc31-61312531b3d9\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.709244 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-credential-keys\") pod \"5c397c6e-8d19-4b92-bc31-61312531b3d9\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.714113 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-scripts" (OuterVolumeSpecName: "scripts") pod "5c397c6e-8d19-4b92-bc31-61312531b3d9" (UID: "5c397c6e-8d19-4b92-bc31-61312531b3d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.714352 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c397c6e-8d19-4b92-bc31-61312531b3d9-kube-api-access-kcf2d" (OuterVolumeSpecName: "kube-api-access-kcf2d") pod "5c397c6e-8d19-4b92-bc31-61312531b3d9" (UID: "5c397c6e-8d19-4b92-bc31-61312531b3d9"). InnerVolumeSpecName "kube-api-access-kcf2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.721011 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5c397c6e-8d19-4b92-bc31-61312531b3d9" (UID: "5c397c6e-8d19-4b92-bc31-61312531b3d9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.731001 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5c397c6e-8d19-4b92-bc31-61312531b3d9" (UID: "5c397c6e-8d19-4b92-bc31-61312531b3d9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.736000 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-config-data" (OuterVolumeSpecName: "config-data") pod "5c397c6e-8d19-4b92-bc31-61312531b3d9" (UID: "5c397c6e-8d19-4b92-bc31-61312531b3d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.744805 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c397c6e-8d19-4b92-bc31-61312531b3d9" (UID: "5c397c6e-8d19-4b92-bc31-61312531b3d9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.811770 4790 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.811805 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.811814 4790 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.811822 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.811831 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcf2d\" (UniqueName: \"kubernetes.io/projected/5c397c6e-8d19-4b92-bc31-61312531b3d9-kube-api-access-kcf2d\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.811839 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.065486 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rfr4j" event={"ID":"5c397c6e-8d19-4b92-bc31-61312531b3d9","Type":"ContainerDied","Data":"f6dce01b5701bc8518d8b3503d0cb256f3f959202a6e0fa8b6a41a1cef8da1af"} Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 
20:47:41.065854 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6dce01b5701bc8518d8b3503d0cb256f3f959202a6e0fa8b6a41a1cef8da1af" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.065660 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.707241 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rfr4j"] Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.713950 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rfr4j"] Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.798359 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-m4zxn"] Mar 13 20:47:41 crc kubenswrapper[4790]: E0313 20:47:41.798897 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c397c6e-8d19-4b92-bc31-61312531b3d9" containerName="keystone-bootstrap" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.798923 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c397c6e-8d19-4b92-bc31-61312531b3d9" containerName="keystone-bootstrap" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.799123 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c397c6e-8d19-4b92-bc31-61312531b3d9" containerName="keystone-bootstrap" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.799863 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.802524 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.802670 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.802828 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.802859 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.802931 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jntkf" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.811851 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-m4zxn"] Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.850828 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-credential-keys\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.850898 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-combined-ca-bundle\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.850917 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-scripts\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.850935 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghp5f\" (UniqueName: \"kubernetes.io/projected/dd2c3694-0492-400f-98bd-b3c641edfac0-kube-api-access-ghp5f\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.851003 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-config-data\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.851081 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-fernet-keys\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.952692 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-combined-ca-bundle\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.953054 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-scripts\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.953110 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghp5f\" (UniqueName: \"kubernetes.io/projected/dd2c3694-0492-400f-98bd-b3c641edfac0-kube-api-access-ghp5f\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.953135 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-config-data\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.953289 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-fernet-keys\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.953434 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-credential-keys\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.958920 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-combined-ca-bundle\") pod 
\"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.959015 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-fernet-keys\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.959127 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-credential-keys\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.959137 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-config-data\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.967754 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-scripts\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.969104 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghp5f\" (UniqueName: \"kubernetes.io/projected/dd2c3694-0492-400f-98bd-b3c641edfac0-kube-api-access-ghp5f\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:42 crc 
kubenswrapper[4790]: I0313 20:47:42.128843 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:43 crc kubenswrapper[4790]: I0313 20:47:43.674155 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c397c6e-8d19-4b92-bc31-61312531b3d9" path="/var/lib/kubelet/pods/5c397c6e-8d19-4b92-bc31-61312531b3d9/volumes" Mar 13 20:47:44 crc kubenswrapper[4790]: I0313 20:47:44.016050 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:47:44 crc kubenswrapper[4790]: I0313 20:47:44.016120 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:47:46 crc kubenswrapper[4790]: I0313 20:47:46.775978 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" podUID="34d41874-8dfa-4e3d-9298-d027a3e3c921" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: i/o timeout" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.134428 4790 generic.go:334] "Generic (PLEG): container finished" podID="eef97bfb-4275-4a0a-bae4-5442cf7400dd" containerID="0f4d13a4ad3c2ce36bd8fc01aafd587a060f2b33fce34cbf54f0cbd83e9fb1ca" exitCode=0 Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.134487 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mg4xg" 
event={"ID":"eef97bfb-4275-4a0a-bae4-5442cf7400dd","Type":"ContainerDied","Data":"0f4d13a4ad3c2ce36bd8fc01aafd587a060f2b33fce34cbf54f0cbd83e9fb1ca"} Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.632441 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.831576 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-config\") pod \"34d41874-8dfa-4e3d-9298-d027a3e3c921\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.831645 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-swift-storage-0\") pod \"34d41874-8dfa-4e3d-9298-d027a3e3c921\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.831719 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxcd9\" (UniqueName: \"kubernetes.io/projected/34d41874-8dfa-4e3d-9298-d027a3e3c921-kube-api-access-rxcd9\") pod \"34d41874-8dfa-4e3d-9298-d027a3e3c921\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.831747 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-sb\") pod \"34d41874-8dfa-4e3d-9298-d027a3e3c921\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.831813 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-nb\") pod \"34d41874-8dfa-4e3d-9298-d027a3e3c921\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.831871 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-svc\") pod \"34d41874-8dfa-4e3d-9298-d027a3e3c921\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.850418 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34d41874-8dfa-4e3d-9298-d027a3e3c921-kube-api-access-rxcd9" (OuterVolumeSpecName: "kube-api-access-rxcd9") pod "34d41874-8dfa-4e3d-9298-d027a3e3c921" (UID: "34d41874-8dfa-4e3d-9298-d027a3e3c921"). InnerVolumeSpecName "kube-api-access-rxcd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.880458 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "34d41874-8dfa-4e3d-9298-d027a3e3c921" (UID: "34d41874-8dfa-4e3d-9298-d027a3e3c921"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.892733 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "34d41874-8dfa-4e3d-9298-d027a3e3c921" (UID: "34d41874-8dfa-4e3d-9298-d027a3e3c921"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.894251 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-config" (OuterVolumeSpecName: "config") pod "34d41874-8dfa-4e3d-9298-d027a3e3c921" (UID: "34d41874-8dfa-4e3d-9298-d027a3e3c921"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.904101 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "34d41874-8dfa-4e3d-9298-d027a3e3c921" (UID: "34d41874-8dfa-4e3d-9298-d027a3e3c921"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.906684 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "34d41874-8dfa-4e3d-9298-d027a3e3c921" (UID: "34d41874-8dfa-4e3d-9298-d027a3e3c921"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.934891 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.934930 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.934941 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.934950 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.934958 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.934968 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxcd9\" (UniqueName: \"kubernetes.io/projected/34d41874-8dfa-4e3d-9298-d027a3e3c921-kube-api-access-rxcd9\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:48 crc kubenswrapper[4790]: E0313 20:47:48.090465 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 13 20:47:48 crc kubenswrapper[4790]: E0313 20:47:48.090650 4790 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tw2bf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-kkmzk_openstack(5dff6930-5d07-4df7-8d42-470ae83afd38): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:47:48 crc kubenswrapper[4790]: E0313 20:47:48.091859 4790 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-kkmzk" podUID="5dff6930-5d07-4df7-8d42-470ae83afd38" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.118267 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.124989 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.152287 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54dbf7ffd5-z6rf5" event={"ID":"bdc44913-44bd-4899-8f7b-d4908bad33c3","Type":"ContainerDied","Data":"bb504ca44dce1509b49f6335cf8ea3f3e23e6aefb5e7712baa5b25d4cf19fcdc"} Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.152316 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.155782 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" event={"ID":"34d41874-8dfa-4e3d-9298-d027a3e3c921","Type":"ContainerDied","Data":"deda6dfa1a9df2280b428b849e29fe6809f9079a777def89e5ae47fabd177aa8"} Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.155864 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.161577 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58656c768f-spczn" event={"ID":"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6","Type":"ContainerDied","Data":"5c692e4cb4d1561525c952071763fb787c93bbc98ee4b7e875e7458714b9da0a"} Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.161813 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:48 crc kubenswrapper[4790]: E0313 20:47:48.164992 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-kkmzk" podUID="5dff6930-5d07-4df7-8d42-470ae83afd38" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.211957 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-v8dxb"] Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.222858 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-v8dxb"] Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.239124 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq4pr\" (UniqueName: \"kubernetes.io/projected/bdc44913-44bd-4899-8f7b-d4908bad33c3-kube-api-access-lq4pr\") pod \"bdc44913-44bd-4899-8f7b-d4908bad33c3\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.239166 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bdc44913-44bd-4899-8f7b-d4908bad33c3-horizon-secret-key\") pod \"bdc44913-44bd-4899-8f7b-d4908bad33c3\" (UID: 
\"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.239282 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-scripts\") pod \"bdc44913-44bd-4899-8f7b-d4908bad33c3\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.239339 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-logs\") pod \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.239370 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2wvs\" (UniqueName: \"kubernetes.io/projected/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-kube-api-access-v2wvs\") pod \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.239466 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-config-data\") pod \"bdc44913-44bd-4899-8f7b-d4908bad33c3\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.239512 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-config-data\") pod \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.239537 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/bdc44913-44bd-4899-8f7b-d4908bad33c3-logs\") pod \"bdc44913-44bd-4899-8f7b-d4908bad33c3\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.239573 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-horizon-secret-key\") pod \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.239683 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-logs" (OuterVolumeSpecName: "logs") pod "4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6" (UID: "4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.239986 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-scripts\") pod \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.240206 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-config-data" (OuterVolumeSpecName: "config-data") pod "bdc44913-44bd-4899-8f7b-d4908bad33c3" (UID: "bdc44913-44bd-4899-8f7b-d4908bad33c3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.240542 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.240571 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.240195 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-config-data" (OuterVolumeSpecName: "config-data") pod "4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6" (UID: "4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.240842 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdc44913-44bd-4899-8f7b-d4908bad33c3-logs" (OuterVolumeSpecName: "logs") pod "bdc44913-44bd-4899-8f7b-d4908bad33c3" (UID: "bdc44913-44bd-4899-8f7b-d4908bad33c3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.240895 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-scripts" (OuterVolumeSpecName: "scripts") pod "bdc44913-44bd-4899-8f7b-d4908bad33c3" (UID: "bdc44913-44bd-4899-8f7b-d4908bad33c3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.241356 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-scripts" (OuterVolumeSpecName: "scripts") pod "4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6" (UID: "4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.244001 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdc44913-44bd-4899-8f7b-d4908bad33c3-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "bdc44913-44bd-4899-8f7b-d4908bad33c3" (UID: "bdc44913-44bd-4899-8f7b-d4908bad33c3"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.244059 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-kube-api-access-v2wvs" (OuterVolumeSpecName: "kube-api-access-v2wvs") pod "4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6" (UID: "4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6"). InnerVolumeSpecName "kube-api-access-v2wvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.244093 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdc44913-44bd-4899-8f7b-d4908bad33c3-kube-api-access-lq4pr" (OuterVolumeSpecName: "kube-api-access-lq4pr") pod "bdc44913-44bd-4899-8f7b-d4908bad33c3" (UID: "bdc44913-44bd-4899-8f7b-d4908bad33c3"). InnerVolumeSpecName "kube-api-access-lq4pr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.244548 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6" (UID: "4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.341590 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.341617 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2wvs\" (UniqueName: \"kubernetes.io/projected/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-kube-api-access-v2wvs\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.341628 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.341647 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdc44913-44bd-4899-8f7b-d4908bad33c3-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.341656 4790 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.341666 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.341674 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq4pr\" (UniqueName: \"kubernetes.io/projected/bdc44913-44bd-4899-8f7b-d4908bad33c3-kube-api-access-lq4pr\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.341683 4790 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bdc44913-44bd-4899-8f7b-d4908bad33c3-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.524151 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54dbf7ffd5-z6rf5"] Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.536497 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-54dbf7ffd5-z6rf5"] Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.554026 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58656c768f-spczn"] Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.560174 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-58656c768f-spczn"] Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.310516 4790 scope.go:117] "RemoveContainer" containerID="f7f9d1b3630ded700d395f27b979ba4f7e495b7e7a9a351c560af831a148e3d8" Mar 13 20:47:49 crc kubenswrapper[4790]: E0313 20:47:49.342091 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 13 20:47:49 crc kubenswrapper[4790]: E0313 20:47:49.342286 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8mdhq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsU
ser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-g2nmn_openstack(32ffb609-7a3b-42b7-b513-7003deefe5dd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:47:49 crc kubenswrapper[4790]: E0313 20:47:49.343580 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-g2nmn" podUID="32ffb609-7a3b-42b7-b513-7003deefe5dd" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.431267 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.490011 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llgn6\" (UniqueName: \"kubernetes.io/projected/eef97bfb-4275-4a0a-bae4-5442cf7400dd-kube-api-access-llgn6\") pod \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\" (UID: \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.490051 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-config\") pod \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\" (UID: \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.490118 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-combined-ca-bundle\") pod \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\" (UID: \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.506080 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eef97bfb-4275-4a0a-bae4-5442cf7400dd-kube-api-access-llgn6" (OuterVolumeSpecName: "kube-api-access-llgn6") pod "eef97bfb-4275-4a0a-bae4-5442cf7400dd" (UID: "eef97bfb-4275-4a0a-bae4-5442cf7400dd"). InnerVolumeSpecName "kube-api-access-llgn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.516040 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eef97bfb-4275-4a0a-bae4-5442cf7400dd" (UID: "eef97bfb-4275-4a0a-bae4-5442cf7400dd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.522270 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-config" (OuterVolumeSpecName: "config") pod "eef97bfb-4275-4a0a-bae4-5442cf7400dd" (UID: "eef97bfb-4275-4a0a-bae4-5442cf7400dd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.591572 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.591629 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llgn6\" (UniqueName: \"kubernetes.io/projected/eef97bfb-4275-4a0a-bae4-5442cf7400dd-kube-api-access-llgn6\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.591646 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.674074 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34d41874-8dfa-4e3d-9298-d027a3e3c921" path="/var/lib/kubelet/pods/34d41874-8dfa-4e3d-9298-d027a3e3c921/volumes" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.674930 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6" path="/var/lib/kubelet/pods/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6/volumes" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.675459 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdc44913-44bd-4899-8f7b-d4908bad33c3" 
path="/var/lib/kubelet/pods/bdc44913-44bd-4899-8f7b-d4908bad33c3/volumes" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.798553 4790 scope.go:117] "RemoveContainer" containerID="15ece6b8455b1981485fe94641a2ac4a65bad6b4ff6e1fde766f9de31aa3ea24" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.907796 4790 scope.go:117] "RemoveContainer" containerID="3616d5067b96b923fdb7687f09ebcaab7f10f570884d08425f74771298051193" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.994436 4790 scope.go:117] "RemoveContainer" containerID="267c9dc16cc049015bd4edf304ecb796705f9133394d3f5b1d188823da72e942" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.045757 4790 scope.go:117] "RemoveContainer" containerID="03a87f5d6c3388f53ac8b07b4a8345caa059485eb6f71dad3953ac168c0ce643" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.152562 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77655f674d-4r7h4"] Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.184404 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mg4xg" event={"ID":"eef97bfb-4275-4a0a-bae4-5442cf7400dd","Type":"ContainerDied","Data":"e447298b31e6ff846ad9bdd5d124a1ff7c9fbf3f3daae17551c3e92c51b821f0"} Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.184441 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e447298b31e6ff846ad9bdd5d124a1ff7c9fbf3f3daae17551c3e92c51b821f0" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.184506 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.190246 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77655f674d-4r7h4" event={"ID":"596ad32f-9087-4dbe-a495-8bf03200cd60","Type":"ContainerStarted","Data":"32071f4748bdbdbbb2169f1b2a9fc194d9a40accb2c6784c59874d08e8b9f3b6"} Mar 13 20:47:50 crc kubenswrapper[4790]: E0313 20:47:50.191304 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-g2nmn" podUID="32ffb609-7a3b-42b7-b513-7003deefe5dd" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.270110 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-686b857b8-6fghv"] Mar 13 20:47:50 crc kubenswrapper[4790]: W0313 20:47:50.272484 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0f5105d_51ea_4e5e_832f_8302188a943a.slice/crio-6723085da12e953e7a5a4368be76839cdf9f59193d4c956744420d89e3776221 WatchSource:0}: Error finding container 6723085da12e953e7a5a4368be76839cdf9f59193d4c956744420d89e3776221: Status 404 returned error can't find the container with id 6723085da12e953e7a5a4368be76839cdf9f59193d4c956744420d89e3776221 Mar 13 20:47:50 crc kubenswrapper[4790]: W0313 20:47:50.354769 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod773bad92_580e_4a9c_9ba5_eef9d8bbc40d.slice/crio-b2d041fbf6a68ca43a859ff33ee8b3f4522929d6bbc2ac451a4da91c437362dd WatchSource:0}: Error finding container b2d041fbf6a68ca43a859ff33ee8b3f4522929d6bbc2ac451a4da91c437362dd: Status 404 returned error can't find the container with id 
b2d041fbf6a68ca43a859ff33ee8b3f4522929d6bbc2ac451a4da91c437362dd Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.358506 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.373757 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-m4zxn"] Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.392155 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 20:47:50 crc kubenswrapper[4790]: W0313 20:47:50.517054 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ed0eb88_051d_48ad_a934_3cfb7dbcd0f2.slice/crio-ba875bd508f6a929ed72f4f60e05be777631ac626ac3eec05ada1ba30d28bfc5 WatchSource:0}: Error finding container ba875bd508f6a929ed72f4f60e05be777631ac626ac3eec05ada1ba30d28bfc5: Status 404 returned error can't find the container with id ba875bd508f6a929ed72f4f60e05be777631ac626ac3eec05ada1ba30d28bfc5 Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.518368 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.735639 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-87qd2"] Mar 13 20:47:50 crc kubenswrapper[4790]: E0313 20:47:50.736098 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d41874-8dfa-4e3d-9298-d027a3e3c921" containerName="dnsmasq-dns" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.736112 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d41874-8dfa-4e3d-9298-d027a3e3c921" containerName="dnsmasq-dns" Mar 13 20:47:50 crc kubenswrapper[4790]: E0313 20:47:50.736125 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d41874-8dfa-4e3d-9298-d027a3e3c921" containerName="init" Mar 
13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.736133 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d41874-8dfa-4e3d-9298-d027a3e3c921" containerName="init" Mar 13 20:47:50 crc kubenswrapper[4790]: E0313 20:47:50.736150 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef97bfb-4275-4a0a-bae4-5442cf7400dd" containerName="neutron-db-sync" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.736158 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef97bfb-4275-4a0a-bae4-5442cf7400dd" containerName="neutron-db-sync" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.736352 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef97bfb-4275-4a0a-bae4-5442cf7400dd" containerName="neutron-db-sync" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.736371 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="34d41874-8dfa-4e3d-9298-d027a3e3c921" containerName="dnsmasq-dns" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.738079 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.757044 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-87qd2"] Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.821273 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.821361 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.821488 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xvqz\" (UniqueName: \"kubernetes.io/projected/6792eda6-a284-42ab-a650-f21b012f7f44-kube-api-access-6xvqz\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.821520 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.821554 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-config\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.821577 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-svc\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.885699 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5fc7fb5bf6-ctr9l"] Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.887002 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.891677 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-6dm5h" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.891933 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.898298 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.913773 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.923431 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5fc7fb5bf6-ctr9l"] Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.924300 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.924359 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xvqz\" (UniqueName: \"kubernetes.io/projected/6792eda6-a284-42ab-a650-f21b012f7f44-kube-api-access-6xvqz\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.924403 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.924430 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-config\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.924445 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-svc\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.924501 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.925322 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.934878 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.935697 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.936700 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-svc\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.935549 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-config\") pod 
\"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.973325 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xvqz\" (UniqueName: \"kubernetes.io/projected/6792eda6-a284-42ab-a650-f21b012f7f44-kube-api-access-6xvqz\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.029093 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-ovndb-tls-certs\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.029201 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-combined-ca-bundle\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.029226 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcl26\" (UniqueName: \"kubernetes.io/projected/96f53d5c-8b27-4810-a760-f7c9a4ee567b-kube-api-access-xcl26\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.029313 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-httpd-config\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.029355 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-config\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.130973 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-httpd-config\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.131059 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-config\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.131094 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-ovndb-tls-certs\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.131585 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-combined-ca-bundle\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: 
\"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.131680 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcl26\" (UniqueName: \"kubernetes.io/projected/96f53d5c-8b27-4810-a760-f7c9a4ee567b-kube-api-access-xcl26\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.134993 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-ovndb-tls-certs\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.137092 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-config\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.137227 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-combined-ca-bundle\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.137662 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-httpd-config\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 
20:47:51.148456 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcl26\" (UniqueName: \"kubernetes.io/projected/96f53d5c-8b27-4810-a760-f7c9a4ee567b-kube-api-access-xcl26\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.156825 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.190163 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.207111 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76485b6c5-pjfp4" event={"ID":"ccb1b2e8-4b05-411b-a540-6507fdd5775f","Type":"ContainerStarted","Data":"f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488"} Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.214502 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m4zxn" event={"ID":"dd2c3694-0492-400f-98bd-b3c641edfac0","Type":"ContainerStarted","Data":"de4f3208380e46019eb11e33bfcd9916170845c8672c15c2d9cbbb7f438283bb"} Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.214537 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m4zxn" event={"ID":"dd2c3694-0492-400f-98bd-b3c641edfac0","Type":"ContainerStarted","Data":"c83d34d6ac5900a556ff1a044d46cc3895eff153b6e266aeaf785a861d60ccbe"} Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.234677 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1abdfade-817b-4659-b8be-48bb516fb866","Type":"ContainerStarted","Data":"0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c"} Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.243222 
4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2","Type":"ContainerStarted","Data":"ba875bd508f6a929ed72f4f60e05be777631ac626ac3eec05ada1ba30d28bfc5"} Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.246709 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"773bad92-580e-4a9c-9ba5-eef9d8bbc40d","Type":"ContainerStarted","Data":"b2d041fbf6a68ca43a859ff33ee8b3f4522929d6bbc2ac451a4da91c437362dd"} Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.249090 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-686b857b8-6fghv" event={"ID":"d0f5105d-51ea-4e5e-832f-8302188a943a","Type":"ContainerStarted","Data":"f1a3bc5c28291092364d29567ef31da8206a941647d2233217845f100a2524d3"} Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.249132 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-686b857b8-6fghv" event={"ID":"d0f5105d-51ea-4e5e-832f-8302188a943a","Type":"ContainerStarted","Data":"6723085da12e953e7a5a4368be76839cdf9f59193d4c956744420d89e3776221"} Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.274642 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wbb8v" event={"ID":"e8b8bbca-4be9-43d3-b692-0587892a50b4","Type":"ContainerStarted","Data":"51c35566a48d60d5e5b84368517b8e770f4896138c85e1636c2114cd13bfa196"} Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.284026 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77655f674d-4r7h4" event={"ID":"596ad32f-9087-4dbe-a495-8bf03200cd60","Type":"ContainerStarted","Data":"75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483"} Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.314253 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-wbb8v" 
podStartSLOduration=5.082679712 podStartE2EDuration="30.314233089s" podCreationTimestamp="2026-03-13 20:47:21 +0000 UTC" firstStartedPulling="2026-03-13 20:47:22.874153372 +0000 UTC m=+1173.895269263" lastFinishedPulling="2026-03-13 20:47:48.105706749 +0000 UTC m=+1199.126822640" observedRunningTime="2026-03-13 20:47:51.302636563 +0000 UTC m=+1202.323752454" watchObservedRunningTime="2026-03-13 20:47:51.314233089 +0000 UTC m=+1202.335348980" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.771635 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-87qd2"] Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.777198 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" podUID="34d41874-8dfa-4e3d-9298-d027a3e3c921" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: i/o timeout" Mar 13 20:47:52 crc kubenswrapper[4790]: W0313 20:47:52.116470 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96f53d5c_8b27_4810_a760_f7c9a4ee567b.slice/crio-244df6a639eefd00639382cbb0a24020174298bb705bcc4beb3a1a60874bb9a0 WatchSource:0}: Error finding container 244df6a639eefd00639382cbb0a24020174298bb705bcc4beb3a1a60874bb9a0: Status 404 returned error can't find the container with id 244df6a639eefd00639382cbb0a24020174298bb705bcc4beb3a1a60874bb9a0 Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.117413 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5fc7fb5bf6-ctr9l"] Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.293530 4790 generic.go:334] "Generic (PLEG): container finished" podID="6792eda6-a284-42ab-a650-f21b012f7f44" containerID="c85d717e10fb599c6b3d50e3cfc797654ac9faa539262c4f8824bda9117967e3" exitCode=0 Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.293601 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-55f844cf75-87qd2" event={"ID":"6792eda6-a284-42ab-a650-f21b012f7f44","Type":"ContainerDied","Data":"c85d717e10fb599c6b3d50e3cfc797654ac9faa539262c4f8824bda9117967e3"} Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.293849 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" event={"ID":"6792eda6-a284-42ab-a650-f21b012f7f44","Type":"ContainerStarted","Data":"eb26cba5d4f1f28cf0c444cc204a575c1f7bb95d1f8f9337a19506bba53fe819"} Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.304510 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-686b857b8-6fghv" event={"ID":"d0f5105d-51ea-4e5e-832f-8302188a943a","Type":"ContainerStarted","Data":"29f470a4b7bd1504791f24aa94bc2135d812fc6b51bc685b80f06c023dc9d304"} Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.309731 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77655f674d-4r7h4" event={"ID":"596ad32f-9087-4dbe-a495-8bf03200cd60","Type":"ContainerStarted","Data":"59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69"} Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.323284 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76485b6c5-pjfp4" event={"ID":"ccb1b2e8-4b05-411b-a540-6507fdd5775f","Type":"ContainerStarted","Data":"45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d"} Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.323526 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76485b6c5-pjfp4" podUID="ccb1b2e8-4b05-411b-a540-6507fdd5775f" containerName="horizon-log" containerID="cri-o://f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488" gracePeriod=30 Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.323755 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76485b6c5-pjfp4" 
podUID="ccb1b2e8-4b05-411b-a540-6507fdd5775f" containerName="horizon" containerID="cri-o://45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d" gracePeriod=30 Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.339136 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fc7fb5bf6-ctr9l" event={"ID":"96f53d5c-8b27-4810-a760-f7c9a4ee567b","Type":"ContainerStarted","Data":"244df6a639eefd00639382cbb0a24020174298bb705bcc4beb3a1a60874bb9a0"} Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.344236 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2","Type":"ContainerStarted","Data":"4349a4319d7d7f3a7af4e8d8122ef2003198a82dbec9b58b843ef6769bc7f33d"} Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.347477 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-77655f674d-4r7h4" podStartSLOduration=22.34745474 podStartE2EDuration="22.34745474s" podCreationTimestamp="2026-03-13 20:47:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:52.335576477 +0000 UTC m=+1203.356692368" watchObservedRunningTime="2026-03-13 20:47:52.34745474 +0000 UTC m=+1203.368570631" Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.360262 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"773bad92-580e-4a9c-9ba5-eef9d8bbc40d","Type":"ContainerStarted","Data":"6f71db6d93e0c718a70afb3c8920d1131d779aebc23ca039251f6366967791e6"} Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.370315 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.370556 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/horizon-686b857b8-6fghv" podStartSLOduration=22.370531699 podStartE2EDuration="22.370531699s" podCreationTimestamp="2026-03-13 20:47:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:52.354149383 +0000 UTC m=+1203.375265284" watchObservedRunningTime="2026-03-13 20:47:52.370531699 +0000 UTC m=+1203.391647590" Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.397009 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-76485b6c5-pjfp4" podStartSLOduration=6.364096975 podStartE2EDuration="31.393352793s" podCreationTimestamp="2026-03-13 20:47:21 +0000 UTC" firstStartedPulling="2026-03-13 20:47:23.076880732 +0000 UTC m=+1174.097996623" lastFinishedPulling="2026-03-13 20:47:48.10613655 +0000 UTC m=+1199.127252441" observedRunningTime="2026-03-13 20:47:52.381862038 +0000 UTC m=+1203.402977929" watchObservedRunningTime="2026-03-13 20:47:52.393352793 +0000 UTC m=+1203.414468684" Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.420715 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-m4zxn" podStartSLOduration=11.420692338 podStartE2EDuration="11.420692338s" podCreationTimestamp="2026-03-13 20:47:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:52.404003812 +0000 UTC m=+1203.425119703" watchObservedRunningTime="2026-03-13 20:47:52.420692338 +0000 UTC m=+1203.441808229" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.166539 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-655d56d4d9-rckws"] Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.168635 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.173743 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.174094 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.180878 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-655d56d4d9-rckws"] Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.291330 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-combined-ca-bundle\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.291392 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-config\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.291417 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-ovndb-tls-certs\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.291444 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9968b\" (UniqueName: 
\"kubernetes.io/projected/ebc86632-179c-403a-bbdd-d496a21c018c-kube-api-access-9968b\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.291481 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-internal-tls-certs\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.291507 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-httpd-config\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.291533 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-public-tls-certs\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.373443 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fc7fb5bf6-ctr9l" event={"ID":"96f53d5c-8b27-4810-a760-f7c9a4ee567b","Type":"ContainerStarted","Data":"8b2e29cd1d39fc375a2c87170b615afd8165699c2feb129d8fe6f2064e48bc4e"} Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.373753 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fc7fb5bf6-ctr9l" 
event={"ID":"96f53d5c-8b27-4810-a760-f7c9a4ee567b","Type":"ContainerStarted","Data":"449a35d79f426767909c30ff57f1a03c65663f3b50a2fecaf21aa36b537c5d09"} Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.374048 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.386906 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2","Type":"ContainerStarted","Data":"96ac0a7c5978eeb8c0f3a4fc52a8593d87b076ec513b819bbd3b74106a8ca70e"} Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.389826 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"773bad92-580e-4a9c-9ba5-eef9d8bbc40d","Type":"ContainerStarted","Data":"5ed056308fa78044710942e2c9cea38e859d820f8739efda67e4e603f99c4343"} Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.392793 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" event={"ID":"6792eda6-a284-42ab-a650-f21b012f7f44","Type":"ContainerStarted","Data":"de1f1e831185e14abf69fe3f42e9442a69f0019e8f482343c4e0783b1bafacb8"} Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.393828 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-combined-ca-bundle\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.393870 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-config\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " 
pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.393926 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-ovndb-tls-certs\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.393953 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9968b\" (UniqueName: \"kubernetes.io/projected/ebc86632-179c-403a-bbdd-d496a21c018c-kube-api-access-9968b\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.393989 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-internal-tls-certs\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.394015 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-httpd-config\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.394041 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-public-tls-certs\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: 
I0313 20:47:53.409233 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5fc7fb5bf6-ctr9l" podStartSLOduration=3.409212899 podStartE2EDuration="3.409212899s" podCreationTimestamp="2026-03-13 20:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:53.395842014 +0000 UTC m=+1204.416957915" watchObservedRunningTime="2026-03-13 20:47:53.409212899 +0000 UTC m=+1204.430328790" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.409293 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-ovndb-tls-certs\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.410917 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-httpd-config\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.411687 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-combined-ca-bundle\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.418315 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-public-tls-certs\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " 
pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.431084 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-internal-tls-certs\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.441243 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-config\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.455074 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9968b\" (UniqueName: \"kubernetes.io/projected/ebc86632-179c-403a-bbdd-d496a21c018c-kube-api-access-9968b\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.487980 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=15.487955657 podStartE2EDuration="15.487955657s" podCreationTimestamp="2026-03-13 20:47:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:53.44148383 +0000 UTC m=+1204.462599741" watchObservedRunningTime="2026-03-13 20:47:53.487955657 +0000 UTC m=+1204.509071548" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.489203 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" podStartSLOduration=3.489194901 podStartE2EDuration="3.489194901s" 
podCreationTimestamp="2026-03-13 20:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:53.484870533 +0000 UTC m=+1204.505986424" watchObservedRunningTime="2026-03-13 20:47:53.489194901 +0000 UTC m=+1204.510310802" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.531019 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=16.530998251 podStartE2EDuration="16.530998251s" podCreationTimestamp="2026-03-13 20:47:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:53.522553961 +0000 UTC m=+1204.543669852" watchObservedRunningTime="2026-03-13 20:47:53.530998251 +0000 UTC m=+1204.552114142" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.543747 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:54 crc kubenswrapper[4790]: I0313 20:47:54.192789 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-655d56d4d9-rckws"] Mar 13 20:47:54 crc kubenswrapper[4790]: W0313 20:47:54.205990 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebc86632_179c_403a_bbdd_d496a21c018c.slice/crio-84f278a1006894e4224ea478ecf0e8138ab1ab2094c647ca6b5763b2c261a6bc WatchSource:0}: Error finding container 84f278a1006894e4224ea478ecf0e8138ab1ab2094c647ca6b5763b2c261a6bc: Status 404 returned error can't find the container with id 84f278a1006894e4224ea478ecf0e8138ab1ab2094c647ca6b5763b2c261a6bc Mar 13 20:47:54 crc kubenswrapper[4790]: I0313 20:47:54.408869 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-655d56d4d9-rckws" event={"ID":"ebc86632-179c-403a-bbdd-d496a21c018c","Type":"ContainerStarted","Data":"84f278a1006894e4224ea478ecf0e8138ab1ab2094c647ca6b5763b2c261a6bc"} Mar 13 20:47:54 crc kubenswrapper[4790]: I0313 20:47:54.411858 4790 generic.go:334] "Generic (PLEG): container finished" podID="e8b8bbca-4be9-43d3-b692-0587892a50b4" containerID="51c35566a48d60d5e5b84368517b8e770f4896138c85e1636c2114cd13bfa196" exitCode=0 Mar 13 20:47:54 crc kubenswrapper[4790]: I0313 20:47:54.412840 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wbb8v" event={"ID":"e8b8bbca-4be9-43d3-b692-0587892a50b4","Type":"ContainerDied","Data":"51c35566a48d60d5e5b84368517b8e770f4896138c85e1636c2114cd13bfa196"} Mar 13 20:47:54 crc kubenswrapper[4790]: I0313 20:47:54.412882 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.156350 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.288723 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-scripts\") pod \"e8b8bbca-4be9-43d3-b692-0587892a50b4\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.288771 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8b8bbca-4be9-43d3-b692-0587892a50b4-logs\") pod \"e8b8bbca-4be9-43d3-b692-0587892a50b4\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.288815 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s8z8\" (UniqueName: \"kubernetes.io/projected/e8b8bbca-4be9-43d3-b692-0587892a50b4-kube-api-access-6s8z8\") pod \"e8b8bbca-4be9-43d3-b692-0587892a50b4\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.288909 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-combined-ca-bundle\") pod \"e8b8bbca-4be9-43d3-b692-0587892a50b4\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.288965 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-config-data\") pod \"e8b8bbca-4be9-43d3-b692-0587892a50b4\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.289778 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e8b8bbca-4be9-43d3-b692-0587892a50b4-logs" (OuterVolumeSpecName: "logs") pod "e8b8bbca-4be9-43d3-b692-0587892a50b4" (UID: "e8b8bbca-4be9-43d3-b692-0587892a50b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.295830 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8b8bbca-4be9-43d3-b692-0587892a50b4-kube-api-access-6s8z8" (OuterVolumeSpecName: "kube-api-access-6s8z8") pod "e8b8bbca-4be9-43d3-b692-0587892a50b4" (UID: "e8b8bbca-4be9-43d3-b692-0587892a50b4"). InnerVolumeSpecName "kube-api-access-6s8z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.296491 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-scripts" (OuterVolumeSpecName: "scripts") pod "e8b8bbca-4be9-43d3-b692-0587892a50b4" (UID: "e8b8bbca-4be9-43d3-b692-0587892a50b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.316294 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8b8bbca-4be9-43d3-b692-0587892a50b4" (UID: "e8b8bbca-4be9-43d3-b692-0587892a50b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.318580 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-config-data" (OuterVolumeSpecName: "config-data") pod "e8b8bbca-4be9-43d3-b692-0587892a50b4" (UID: "e8b8bbca-4be9-43d3-b692-0587892a50b4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.391548 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.391594 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.391608 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.391620 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8b8bbca-4be9-43d3-b692-0587892a50b4-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.391632 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s8z8\" (UniqueName: \"kubernetes.io/projected/e8b8bbca-4be9-43d3-b692-0587892a50b4-kube-api-access-6s8z8\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.444517 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-655d56d4d9-rckws" event={"ID":"ebc86632-179c-403a-bbdd-d496a21c018c","Type":"ContainerStarted","Data":"9d9f4f92c9adc75b1871526d72226856908062d5bbfea344d681a5684ec5cad0"} Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.446921 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wbb8v" event={"ID":"e8b8bbca-4be9-43d3-b692-0587892a50b4","Type":"ContainerDied","Data":"a21e9d9d91f185d12ea208152b90be684a6632b64a67e174026d61018c3b2d9d"} Mar 13 20:47:57 
crc kubenswrapper[4790]: I0313 20:47:57.446965 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a21e9d9d91f185d12ea208152b90be684a6632b64a67e174026d61018c3b2d9d" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.447014 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.739636 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.739694 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.772741 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.782571 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.255197 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6cd9b448d6-w8fcr"] Mar 13 20:47:58 crc kubenswrapper[4790]: E0313 20:47:58.255764 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8b8bbca-4be9-43d3-b692-0587892a50b4" containerName="placement-db-sync" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.255788 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8b8bbca-4be9-43d3-b692-0587892a50b4" containerName="placement-db-sync" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.256026 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8b8bbca-4be9-43d3-b692-0587892a50b4" containerName="placement-db-sync" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.257298 4790 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.259759 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.259895 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2nkvl" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.260164 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.261091 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.261508 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.278051 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6cd9b448d6-w8fcr"] Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.311189 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-internal-tls-certs\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.311301 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-config-data\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.311328 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88252e8c-21d9-402a-bab0-9f61b5eb3a70-logs\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.311358 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-public-tls-certs\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.311416 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-combined-ca-bundle\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.311460 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl6nj\" (UniqueName: \"kubernetes.io/projected/88252e8c-21d9-402a-bab0-9f61b5eb3a70-kube-api-access-xl6nj\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.311483 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-scripts\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.412606 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl6nj\" (UniqueName: \"kubernetes.io/projected/88252e8c-21d9-402a-bab0-9f61b5eb3a70-kube-api-access-xl6nj\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.412681 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-scripts\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.412763 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-internal-tls-certs\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.412829 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-config-data\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.412846 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88252e8c-21d9-402a-bab0-9f61b5eb3a70-logs\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.412868 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-public-tls-certs\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.412901 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-combined-ca-bundle\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.413587 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88252e8c-21d9-402a-bab0-9f61b5eb3a70-logs\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.418937 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-config-data\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.423676 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-scripts\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.423905 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-internal-tls-certs\") pod \"placement-6cd9b448d6-w8fcr\" (UID: 
\"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.427316 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-public-tls-certs\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.429610 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl6nj\" (UniqueName: \"kubernetes.io/projected/88252e8c-21d9-402a-bab0-9f61b5eb3a70-kube-api-access-xl6nj\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.429813 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-combined-ca-bundle\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.433137 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.434015 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.467888 4790 generic.go:334] "Generic (PLEG): container finished" podID="dd2c3694-0492-400f-98bd-b3c641edfac0" containerID="de4f3208380e46019eb11e33bfcd9916170845c8672c15c2d9cbbb7f438283bb" exitCode=0 Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.469132 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-m4zxn" event={"ID":"dd2c3694-0492-400f-98bd-b3c641edfac0","Type":"ContainerDied","Data":"de4f3208380e46019eb11e33bfcd9916170845c8672c15c2d9cbbb7f438283bb"} Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.469175 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.470249 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.470394 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.544430 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.581494 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:59 crc kubenswrapper[4790]: I0313 20:47:59.479831 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 20:47:59 crc kubenswrapper[4790]: I0313 20:47:59.479955 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.156713 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557248-msw96"] Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.163312 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557248-msw96" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.168899 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.168899 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.173272 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557248-msw96"] Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.179108 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.260550 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc4x9\" (UniqueName: \"kubernetes.io/projected/eda4da8c-f54a-4c25-9669-ff180aa0b9a9-kube-api-access-tc4x9\") pod \"auto-csr-approver-29557248-msw96\" (UID: \"eda4da8c-f54a-4c25-9669-ff180aa0b9a9\") " pod="openshift-infra/auto-csr-approver-29557248-msw96" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.365694 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc4x9\" (UniqueName: \"kubernetes.io/projected/eda4da8c-f54a-4c25-9669-ff180aa0b9a9-kube-api-access-tc4x9\") pod \"auto-csr-approver-29557248-msw96\" (UID: \"eda4da8c-f54a-4c25-9669-ff180aa0b9a9\") " pod="openshift-infra/auto-csr-approver-29557248-msw96" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.412089 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc4x9\" (UniqueName: \"kubernetes.io/projected/eda4da8c-f54a-4c25-9669-ff180aa0b9a9-kube-api-access-tc4x9\") pod \"auto-csr-approver-29557248-msw96\" (UID: \"eda4da8c-f54a-4c25-9669-ff180aa0b9a9\") " 
pod="openshift-infra/auto-csr-approver-29557248-msw96" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.428488 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.428527 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.488688 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557248-msw96" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.492270 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.492295 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.497485 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.498477 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.925136 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 20:48:01 crc kubenswrapper[4790]: I0313 20:48:01.159648 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:48:01 crc kubenswrapper[4790]: I0313 20:48:01.228076 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-n8ckq"] Mar 13 20:48:01 crc kubenswrapper[4790]: I0313 20:48:01.228528 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" podUID="7dc42e6e-503c-4931-87e1-adcbf3469570" 
containerName="dnsmasq-dns" containerID="cri-o://a8c73c53b8b1c75ce690b997abf213fa210d425aaf795980f205e05d12075a77" gracePeriod=10 Mar 13 20:48:01 crc kubenswrapper[4790]: I0313 20:48:01.505972 4790 generic.go:334] "Generic (PLEG): container finished" podID="7dc42e6e-503c-4931-87e1-adcbf3469570" containerID="a8c73c53b8b1c75ce690b997abf213fa210d425aaf795980f205e05d12075a77" exitCode=0 Mar 13 20:48:01 crc kubenswrapper[4790]: I0313 20:48:01.506067 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 20:48:01 crc kubenswrapper[4790]: I0313 20:48:01.506052 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" event={"ID":"7dc42e6e-503c-4931-87e1-adcbf3469570","Type":"ContainerDied","Data":"a8c73c53b8b1c75ce690b997abf213fa210d425aaf795980f205e05d12075a77"} Mar 13 20:48:01 crc kubenswrapper[4790]: I0313 20:48:01.572963 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 20:48:01 crc kubenswrapper[4790]: I0313 20:48:01.892572 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 20:48:01 crc kubenswrapper[4790]: I0313 20:48:01.892680 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 20:48:02 crc kubenswrapper[4790]: I0313 20:48:02.093544 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" podUID="7dc42e6e-503c-4931-87e1-adcbf3469570" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.147:5353: connect: connection refused" Mar 13 20:48:02 crc kubenswrapper[4790]: I0313 20:48:02.305312 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.363667 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.450512 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-fernet-keys\") pod \"dd2c3694-0492-400f-98bd-b3c641edfac0\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.450572 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-combined-ca-bundle\") pod \"dd2c3694-0492-400f-98bd-b3c641edfac0\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.450610 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-config-data\") pod \"dd2c3694-0492-400f-98bd-b3c641edfac0\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.450659 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghp5f\" (UniqueName: \"kubernetes.io/projected/dd2c3694-0492-400f-98bd-b3c641edfac0-kube-api-access-ghp5f\") pod \"dd2c3694-0492-400f-98bd-b3c641edfac0\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.450742 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-scripts\") pod \"dd2c3694-0492-400f-98bd-b3c641edfac0\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.450840 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-credential-keys\") pod \"dd2c3694-0492-400f-98bd-b3c641edfac0\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.458002 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd2c3694-0492-400f-98bd-b3c641edfac0-kube-api-access-ghp5f" (OuterVolumeSpecName: "kube-api-access-ghp5f") pod "dd2c3694-0492-400f-98bd-b3c641edfac0" (UID: "dd2c3694-0492-400f-98bd-b3c641edfac0"). InnerVolumeSpecName "kube-api-access-ghp5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.458729 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "dd2c3694-0492-400f-98bd-b3c641edfac0" (UID: "dd2c3694-0492-400f-98bd-b3c641edfac0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.459356 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-scripts" (OuterVolumeSpecName: "scripts") pod "dd2c3694-0492-400f-98bd-b3c641edfac0" (UID: "dd2c3694-0492-400f-98bd-b3c641edfac0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.460456 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "dd2c3694-0492-400f-98bd-b3c641edfac0" (UID: "dd2c3694-0492-400f-98bd-b3c641edfac0"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.484427 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd2c3694-0492-400f-98bd-b3c641edfac0" (UID: "dd2c3694-0492-400f-98bd-b3c641edfac0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.484456 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-config-data" (OuterVolumeSpecName: "config-data") pod "dd2c3694-0492-400f-98bd-b3c641edfac0" (UID: "dd2c3694-0492-400f-98bd-b3c641edfac0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.542686 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m4zxn" event={"ID":"dd2c3694-0492-400f-98bd-b3c641edfac0","Type":"ContainerDied","Data":"c83d34d6ac5900a556ff1a044d46cc3895eff153b6e266aeaf785a861d60ccbe"} Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.542722 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c83d34d6ac5900a556ff1a044d46cc3895eff153b6e266aeaf785a861d60ccbe" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.542795 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.554816 4790 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.554856 4790 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.554868 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.554881 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.554894 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghp5f\" (UniqueName: \"kubernetes.io/projected/dd2c3694-0492-400f-98bd-b3c641edfac0-kube-api-access-ghp5f\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.554908 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.752341 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.858588 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-svc\") pod \"7dc42e6e-503c-4931-87e1-adcbf3469570\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.858671 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-nb\") pod \"7dc42e6e-503c-4931-87e1-adcbf3469570\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.858716 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7977\" (UniqueName: \"kubernetes.io/projected/7dc42e6e-503c-4931-87e1-adcbf3469570-kube-api-access-t7977\") pod \"7dc42e6e-503c-4931-87e1-adcbf3469570\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.858785 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-config\") pod \"7dc42e6e-503c-4931-87e1-adcbf3469570\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.859402 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-sb\") pod \"7dc42e6e-503c-4931-87e1-adcbf3469570\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.859472 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-swift-storage-0\") pod \"7dc42e6e-503c-4931-87e1-adcbf3469570\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.864804 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dc42e6e-503c-4931-87e1-adcbf3469570-kube-api-access-t7977" (OuterVolumeSpecName: "kube-api-access-t7977") pod "7dc42e6e-503c-4931-87e1-adcbf3469570" (UID: "7dc42e6e-503c-4931-87e1-adcbf3469570"). InnerVolumeSpecName "kube-api-access-t7977". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.902806 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557248-msw96"] Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.984712 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6cd9b448d6-w8fcr"] Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.987394 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7977\" (UniqueName: \"kubernetes.io/projected/7dc42e6e-503c-4931-87e1-adcbf3469570-kube-api-access-t7977\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.995173 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7dc42e6e-503c-4931-87e1-adcbf3469570" (UID: "7dc42e6e-503c-4931-87e1-adcbf3469570"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.005968 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7dc42e6e-503c-4931-87e1-adcbf3469570" (UID: "7dc42e6e-503c-4931-87e1-adcbf3469570"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.016939 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7dc42e6e-503c-4931-87e1-adcbf3469570" (UID: "7dc42e6e-503c-4931-87e1-adcbf3469570"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.019351 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7dc42e6e-503c-4931-87e1-adcbf3469570" (UID: "7dc42e6e-503c-4931-87e1-adcbf3469570"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.041005 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-config" (OuterVolumeSpecName: "config") pod "7dc42e6e-503c-4931-87e1-adcbf3469570" (UID: "7dc42e6e-503c-4931-87e1-adcbf3469570"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.091430 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.091457 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.091468 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.091477 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.091485 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.479355 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c5788df58-llnz4"] Mar 13 20:48:04 crc kubenswrapper[4790]: E0313 20:48:04.479934 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd2c3694-0492-400f-98bd-b3c641edfac0" containerName="keystone-bootstrap" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.479945 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd2c3694-0492-400f-98bd-b3c641edfac0" containerName="keystone-bootstrap" Mar 13 20:48:04 crc kubenswrapper[4790]: E0313 20:48:04.479965 
4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc42e6e-503c-4931-87e1-adcbf3469570" containerName="dnsmasq-dns" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.479971 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc42e6e-503c-4931-87e1-adcbf3469570" containerName="dnsmasq-dns" Mar 13 20:48:04 crc kubenswrapper[4790]: E0313 20:48:04.479983 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc42e6e-503c-4931-87e1-adcbf3469570" containerName="init" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.479990 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc42e6e-503c-4931-87e1-adcbf3469570" containerName="init" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.480159 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd2c3694-0492-400f-98bd-b3c641edfac0" containerName="keystone-bootstrap" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.480187 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dc42e6e-503c-4931-87e1-adcbf3469570" containerName="dnsmasq-dns" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.489158 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.506420 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.506668 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.507740 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.507813 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.508090 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jntkf" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.508296 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.547931 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c5788df58-llnz4"] Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.577168 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kkmzk" event={"ID":"5dff6930-5d07-4df7-8d42-470ae83afd38","Type":"ContainerStarted","Data":"062bb846937d0ad9d07de45246277a5920215b483e1948fdfbd9ea7168c9a51a"} Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.596719 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1abdfade-817b-4659-b8be-48bb516fb866","Type":"ContainerStarted","Data":"fe7297aab5981431006e363000146624b164562815f098000374d6b910719486"} Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.600090 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cd9b448d6-w8fcr" 
event={"ID":"88252e8c-21d9-402a-bab0-9f61b5eb3a70","Type":"ContainerStarted","Data":"f4b58b71174400c77e39715f7e4970a3816e119db2203fb9220a857f485f79bd"} Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.600166 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cd9b448d6-w8fcr" event={"ID":"88252e8c-21d9-402a-bab0-9f61b5eb3a70","Type":"ContainerStarted","Data":"5d216af4785a04f3e8536b6945d51a46024ad4cfced21083156e56a883fa3cab"} Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.629135 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-credential-keys\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.629502 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-internal-tls-certs\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.629673 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-config-data\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.630342 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-combined-ca-bundle\") pod \"keystone-c5788df58-llnz4\" (UID: 
\"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.630532 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-fernet-keys\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.630658 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-scripts\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.630759 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpmcw\" (UniqueName: \"kubernetes.io/projected/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-kube-api-access-vpmcw\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.630893 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-public-tls-certs\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.635398 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-kkmzk" podStartSLOduration=2.039330358 podStartE2EDuration="43.635366208s" podCreationTimestamp="2026-03-13 20:47:21 +0000 UTC" 
firstStartedPulling="2026-03-13 20:47:22.546774133 +0000 UTC m=+1173.567890024" lastFinishedPulling="2026-03-13 20:48:04.142809993 +0000 UTC m=+1215.163925874" observedRunningTime="2026-03-13 20:48:04.628090299 +0000 UTC m=+1215.649206190" watchObservedRunningTime="2026-03-13 20:48:04.635366208 +0000 UTC m=+1215.656482099" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.638565 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557248-msw96" event={"ID":"eda4da8c-f54a-4c25-9669-ff180aa0b9a9","Type":"ContainerStarted","Data":"f77f483d75213eae4864a3d19aa92203b67c406a1011bada9c9ab22419c8844d"} Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.664813 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-655d56d4d9-rckws" event={"ID":"ebc86632-179c-403a-bbdd-d496a21c018c","Type":"ContainerStarted","Data":"abc0a2f8a645b936e1377bd5e49e6a7c687f5aa7ff92e068165fc9da1349ac66"} Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.667462 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.691126 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" event={"ID":"7dc42e6e-503c-4931-87e1-adcbf3469570","Type":"ContainerDied","Data":"6f999ef5f392142853bfd734e8f05087e30b1884df0a3fcb1f826bf8ee332e9d"} Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.691169 4790 scope.go:117] "RemoveContainer" containerID="a8c73c53b8b1c75ce690b997abf213fa210d425aaf795980f205e05d12075a77" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.691304 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.707752 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-655d56d4d9-rckws" podStartSLOduration=11.707733171 podStartE2EDuration="11.707733171s" podCreationTimestamp="2026-03-13 20:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:48:04.70034653 +0000 UTC m=+1215.721462411" watchObservedRunningTime="2026-03-13 20:48:04.707733171 +0000 UTC m=+1215.728849062" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.719278 4790 scope.go:117] "RemoveContainer" containerID="365da7d7e0570f42f94165ed1add103834755db474d98891c1b86296bdc4478f" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.732173 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-credential-keys\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.732212 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-internal-tls-certs\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.733014 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-config-data\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: 
I0313 20:48:04.733075 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-combined-ca-bundle\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.733116 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-fernet-keys\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.733134 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-scripts\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.733159 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpmcw\" (UniqueName: \"kubernetes.io/projected/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-kube-api-access-vpmcw\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.733174 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-public-tls-certs\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.739813 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-combined-ca-bundle\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.747651 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-public-tls-certs\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.751827 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-config-data\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.753827 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-credential-keys\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.755897 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-scripts\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.756557 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-fernet-keys\") pod \"keystone-c5788df58-llnz4\" (UID: 
\"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.759704 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-internal-tls-certs\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.761535 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpmcw\" (UniqueName: \"kubernetes.io/projected/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-kube-api-access-vpmcw\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.781694 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-n8ckq"] Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.796884 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-n8ckq"] Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.854635 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.366064 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c5788df58-llnz4"] Mar 13 20:48:05 crc kubenswrapper[4790]: W0313 20:48:05.371529 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c3cfa50_a4b5_45e0_9cb4_d6a5495f4fb7.slice/crio-76c9bfcbd45fc9644160ada88176fbf5325e0f84118a60592e674b2f60e715d6 WatchSource:0}: Error finding container 76c9bfcbd45fc9644160ada88176fbf5325e0f84118a60592e674b2f60e715d6: Status 404 returned error can't find the container with id 76c9bfcbd45fc9644160ada88176fbf5325e0f84118a60592e674b2f60e715d6 Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.670094 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dc42e6e-503c-4931-87e1-adcbf3469570" path="/var/lib/kubelet/pods/7dc42e6e-503c-4931-87e1-adcbf3469570/volumes" Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.704418 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c5788df58-llnz4" event={"ID":"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7","Type":"ContainerStarted","Data":"35f97c0eddce4a6fe454ce018026e47dd5647b67c41351be553885eff30838d2"} Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.704454 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c5788df58-llnz4" event={"ID":"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7","Type":"ContainerStarted","Data":"76c9bfcbd45fc9644160ada88176fbf5325e0f84118a60592e674b2f60e715d6"} Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.705353 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.707649 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cd9b448d6-w8fcr" 
event={"ID":"88252e8c-21d9-402a-bab0-9f61b5eb3a70","Type":"ContainerStarted","Data":"963374fd67ec679caf00dd9bcc27806bbcfe92963bf34ed2f7df82c29a36025b"} Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.707797 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.721238 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.727319 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557248-msw96" event={"ID":"eda4da8c-f54a-4c25-9669-ff180aa0b9a9","Type":"ContainerStarted","Data":"3e3742b7258e70b94cf2ef846ea4b59ba8175c78c72478006fdab7b609eebe2a"} Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.730519 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g2nmn" event={"ID":"32ffb609-7a3b-42b7-b513-7003deefe5dd","Type":"ContainerStarted","Data":"f2216663957b1ff7be0364b827b231924669a938cca6695aaf9da572dc71b0b9"} Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.750410 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-c5788df58-llnz4" podStartSLOduration=1.750366729 podStartE2EDuration="1.750366729s" podCreationTimestamp="2026-03-13 20:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:48:05.741770514 +0000 UTC m=+1216.762886415" watchObservedRunningTime="2026-03-13 20:48:05.750366729 +0000 UTC m=+1216.771482620" Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.761507 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557248-msw96" podStartSLOduration=4.692809664 podStartE2EDuration="5.761486761s" podCreationTimestamp="2026-03-13 20:48:00 
+0000 UTC" firstStartedPulling="2026-03-13 20:48:03.982866581 +0000 UTC m=+1215.003982472" lastFinishedPulling="2026-03-13 20:48:05.051543678 +0000 UTC m=+1216.072659569" observedRunningTime="2026-03-13 20:48:05.760449893 +0000 UTC m=+1216.781565784" watchObservedRunningTime="2026-03-13 20:48:05.761486761 +0000 UTC m=+1216.782602652" Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.779357 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6cd9b448d6-w8fcr" podStartSLOduration=7.779343879 podStartE2EDuration="7.779343879s" podCreationTimestamp="2026-03-13 20:47:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:48:05.775779972 +0000 UTC m=+1216.796895863" watchObservedRunningTime="2026-03-13 20:48:05.779343879 +0000 UTC m=+1216.800459770" Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.810027 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-g2nmn" podStartSLOduration=4.165256906 podStartE2EDuration="45.810005105s" podCreationTimestamp="2026-03-13 20:47:20 +0000 UTC" firstStartedPulling="2026-03-13 20:47:22.341598897 +0000 UTC m=+1173.362714788" lastFinishedPulling="2026-03-13 20:48:03.986347096 +0000 UTC m=+1215.007462987" observedRunningTime="2026-03-13 20:48:05.79879838 +0000 UTC m=+1216.819914281" watchObservedRunningTime="2026-03-13 20:48:05.810005105 +0000 UTC m=+1216.831120996" Mar 13 20:48:06 crc kubenswrapper[4790]: I0313 20:48:06.741637 4790 generic.go:334] "Generic (PLEG): container finished" podID="eda4da8c-f54a-4c25-9669-ff180aa0b9a9" containerID="3e3742b7258e70b94cf2ef846ea4b59ba8175c78c72478006fdab7b609eebe2a" exitCode=0 Mar 13 20:48:06 crc kubenswrapper[4790]: I0313 20:48:06.741785 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557248-msw96" 
event={"ID":"eda4da8c-f54a-4c25-9669-ff180aa0b9a9","Type":"ContainerDied","Data":"3e3742b7258e70b94cf2ef846ea4b59ba8175c78c72478006fdab7b609eebe2a"} Mar 13 20:48:07 crc kubenswrapper[4790]: I0313 20:48:07.761237 4790 generic.go:334] "Generic (PLEG): container finished" podID="5dff6930-5d07-4df7-8d42-470ae83afd38" containerID="062bb846937d0ad9d07de45246277a5920215b483e1948fdfbd9ea7168c9a51a" exitCode=0 Mar 13 20:48:07 crc kubenswrapper[4790]: I0313 20:48:07.762231 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kkmzk" event={"ID":"5dff6930-5d07-4df7-8d42-470ae83afd38","Type":"ContainerDied","Data":"062bb846937d0ad9d07de45246277a5920215b483e1948fdfbd9ea7168c9a51a"} Mar 13 20:48:08 crc kubenswrapper[4790]: I0313 20:48:08.158651 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557248-msw96" Mar 13 20:48:08 crc kubenswrapper[4790]: I0313 20:48:08.303012 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc4x9\" (UniqueName: \"kubernetes.io/projected/eda4da8c-f54a-4c25-9669-ff180aa0b9a9-kube-api-access-tc4x9\") pod \"eda4da8c-f54a-4c25-9669-ff180aa0b9a9\" (UID: \"eda4da8c-f54a-4c25-9669-ff180aa0b9a9\") " Mar 13 20:48:08 crc kubenswrapper[4790]: I0313 20:48:08.324357 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eda4da8c-f54a-4c25-9669-ff180aa0b9a9-kube-api-access-tc4x9" (OuterVolumeSpecName: "kube-api-access-tc4x9") pod "eda4da8c-f54a-4c25-9669-ff180aa0b9a9" (UID: "eda4da8c-f54a-4c25-9669-ff180aa0b9a9"). InnerVolumeSpecName "kube-api-access-tc4x9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:08 crc kubenswrapper[4790]: I0313 20:48:08.405841 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc4x9\" (UniqueName: \"kubernetes.io/projected/eda4da8c-f54a-4c25-9669-ff180aa0b9a9-kube-api-access-tc4x9\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:08 crc kubenswrapper[4790]: I0313 20:48:08.776332 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557248-msw96" event={"ID":"eda4da8c-f54a-4c25-9669-ff180aa0b9a9","Type":"ContainerDied","Data":"f77f483d75213eae4864a3d19aa92203b67c406a1011bada9c9ab22419c8844d"} Mar 13 20:48:08 crc kubenswrapper[4790]: I0313 20:48:08.776366 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557248-msw96" Mar 13 20:48:08 crc kubenswrapper[4790]: I0313 20:48:08.776401 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f77f483d75213eae4864a3d19aa92203b67c406a1011bada9c9ab22419c8844d" Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.359231 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557242-lp8qf"] Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.370761 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557242-lp8qf"] Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.433578 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.528093 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-db-sync-config-data\") pod \"5dff6930-5d07-4df7-8d42-470ae83afd38\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.528224 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw2bf\" (UniqueName: \"kubernetes.io/projected/5dff6930-5d07-4df7-8d42-470ae83afd38-kube-api-access-tw2bf\") pod \"5dff6930-5d07-4df7-8d42-470ae83afd38\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.528252 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-combined-ca-bundle\") pod \"5dff6930-5d07-4df7-8d42-470ae83afd38\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.546678 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dff6930-5d07-4df7-8d42-470ae83afd38-kube-api-access-tw2bf" (OuterVolumeSpecName: "kube-api-access-tw2bf") pod "5dff6930-5d07-4df7-8d42-470ae83afd38" (UID: "5dff6930-5d07-4df7-8d42-470ae83afd38"). InnerVolumeSpecName "kube-api-access-tw2bf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.546780 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5dff6930-5d07-4df7-8d42-470ae83afd38" (UID: "5dff6930-5d07-4df7-8d42-470ae83afd38"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.580506 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dff6930-5d07-4df7-8d42-470ae83afd38" (UID: "5dff6930-5d07-4df7-8d42-470ae83afd38"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.630759 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw2bf\" (UniqueName: \"kubernetes.io/projected/5dff6930-5d07-4df7-8d42-470ae83afd38-kube-api-access-tw2bf\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.630802 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.630816 4790 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.675582 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6027d153-5f8e-4bb1-8275-9a8df8c533f2" path="/var/lib/kubelet/pods/6027d153-5f8e-4bb1-8275-9a8df8c533f2/volumes" Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.790163 4790 generic.go:334] "Generic (PLEG): container finished" podID="32ffb609-7a3b-42b7-b513-7003deefe5dd" containerID="f2216663957b1ff7be0364b827b231924669a938cca6695aaf9da572dc71b0b9" exitCode=0 Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.791139 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-sync-g2nmn" event={"ID":"32ffb609-7a3b-42b7-b513-7003deefe5dd","Type":"ContainerDied","Data":"f2216663957b1ff7be0364b827b231924669a938cca6695aaf9da572dc71b0b9"} Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.802269 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kkmzk" event={"ID":"5dff6930-5d07-4df7-8d42-470ae83afd38","Type":"ContainerDied","Data":"66bbb0c4358595b69723c41e22c295dc43c704b8a97a66ffe918a90a7b96cb73"} Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.802355 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66bbb0c4358595b69723c41e22c295dc43c704b8a97a66ffe918a90a7b96cb73" Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.802531 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.230267 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-798f469b5d-gs7bt"] Mar 13 20:48:10 crc kubenswrapper[4790]: E0313 20:48:10.231091 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eda4da8c-f54a-4c25-9669-ff180aa0b9a9" containerName="oc" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.231169 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda4da8c-f54a-4c25-9669-ff180aa0b9a9" containerName="oc" Mar 13 20:48:10 crc kubenswrapper[4790]: E0313 20:48:10.231252 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dff6930-5d07-4df7-8d42-470ae83afd38" containerName="barbican-db-sync" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.231312 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dff6930-5d07-4df7-8d42-470ae83afd38" containerName="barbican-db-sync" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.231557 4790 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5dff6930-5d07-4df7-8d42-470ae83afd38" containerName="barbican-db-sync" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.231641 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="eda4da8c-f54a-4c25-9669-ff180aa0b9a9" containerName="oc" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.232658 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.239824 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2zqc7" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.240226 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.240403 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.245656 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5d9ddc9bbc-tg88r"] Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.247055 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.257898 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.273177 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-798f469b5d-gs7bt"] Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.324789 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5d9ddc9bbc-tg88r"] Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.345904 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a191811-ef81-4066-bcbb-0385c9258fc0-combined-ca-bundle\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.345941 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98f92730-30b3-4583-ab7c-258c0a0880a2-logs\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.345959 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f859x\" (UniqueName: \"kubernetes.io/projected/8a191811-ef81-4066-bcbb-0385c9258fc0-kube-api-access-f859x\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.345979 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98f92730-30b3-4583-ab7c-258c0a0880a2-combined-ca-bundle\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.346030 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a191811-ef81-4066-bcbb-0385c9258fc0-config-data-custom\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.346050 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpr6b\" (UniqueName: \"kubernetes.io/projected/98f92730-30b3-4583-ab7c-258c0a0880a2-kube-api-access-vpr6b\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.346098 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a191811-ef81-4066-bcbb-0385c9258fc0-logs\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.346123 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a191811-ef81-4066-bcbb-0385c9258fc0-config-data\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " 
pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.346154 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98f92730-30b3-4583-ab7c-258c0a0880a2-config-data\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.346173 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98f92730-30b3-4583-ab7c-258c0a0880a2-config-data-custom\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.422349 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-db5jn"] Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.423844 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.435552 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-77655f674d-4r7h4" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.447778 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98f92730-30b3-4583-ab7c-258c0a0880a2-config-data\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.448082 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98f92730-30b3-4583-ab7c-258c0a0880a2-config-data-custom\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.448315 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a191811-ef81-4066-bcbb-0385c9258fc0-combined-ca-bundle\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.448459 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98f92730-30b3-4583-ab7c-258c0a0880a2-logs\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") 
" pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.448540 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f859x\" (UniqueName: \"kubernetes.io/projected/8a191811-ef81-4066-bcbb-0385c9258fc0-kube-api-access-f859x\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.448618 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98f92730-30b3-4583-ab7c-258c0a0880a2-combined-ca-bundle\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.448725 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a191811-ef81-4066-bcbb-0385c9258fc0-config-data-custom\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.448796 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpr6b\" (UniqueName: \"kubernetes.io/projected/98f92730-30b3-4583-ab7c-258c0a0880a2-kube-api-access-vpr6b\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.448902 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a191811-ef81-4066-bcbb-0385c9258fc0-logs\") pod 
\"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.448985 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a191811-ef81-4066-bcbb-0385c9258fc0-config-data\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.450601 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98f92730-30b3-4583-ab7c-258c0a0880a2-logs\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.451269 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a191811-ef81-4066-bcbb-0385c9258fc0-logs\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.457948 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-db5jn"] Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.462617 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a191811-ef81-4066-bcbb-0385c9258fc0-combined-ca-bundle\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.464632 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a191811-ef81-4066-bcbb-0385c9258fc0-config-data\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.466388 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a191811-ef81-4066-bcbb-0385c9258fc0-config-data-custom\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.470513 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98f92730-30b3-4583-ab7c-258c0a0880a2-combined-ca-bundle\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.478845 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98f92730-30b3-4583-ab7c-258c0a0880a2-config-data-custom\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.483062 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f859x\" (UniqueName: \"kubernetes.io/projected/8a191811-ef81-4066-bcbb-0385c9258fc0-kube-api-access-f859x\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc 
kubenswrapper[4790]: I0313 20:48:10.493250 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpr6b\" (UniqueName: \"kubernetes.io/projected/98f92730-30b3-4583-ab7c-258c0a0880a2-kube-api-access-vpr6b\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.500170 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98f92730-30b3-4583-ab7c-258c0a0880a2-config-data\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.513996 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-686b857b8-6fghv" podUID="d0f5105d-51ea-4e5e-832f-8302188a943a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.551002 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-svc\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.551082 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcrnv\" (UniqueName: \"kubernetes.io/projected/07188cb9-d2b0-4923-a90c-386eb3525476-kube-api-access-lcrnv\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc 
kubenswrapper[4790]: I0313 20:48:10.551112 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-config\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.551150 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.551179 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.551195 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.555079 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5c748666b-tvhxb"] Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.556678 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.561472 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.587798 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.603100 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c748666b-tvhxb"] Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.637526 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.655111 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcrnv\" (UniqueName: \"kubernetes.io/projected/07188cb9-d2b0-4923-a90c-386eb3525476-kube-api-access-lcrnv\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.655183 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84cf2aee-27d9-4022-8c67-55840b2faedd-logs\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.655301 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-config\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: 
I0313 20:48:10.655364 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-combined-ca-bundle\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.655478 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.655571 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.655598 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.655797 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data-custom\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc 
kubenswrapper[4790]: I0313 20:48:10.655825 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2mvl\" (UniqueName: \"kubernetes.io/projected/84cf2aee-27d9-4022-8c67-55840b2faedd-kube-api-access-d2mvl\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.655860 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-svc\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.655909 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.656548 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-config\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.656970 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.657322 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.657723 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.657923 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-svc\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.673819 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcrnv\" (UniqueName: \"kubernetes.io/projected/07188cb9-d2b0-4923-a90c-386eb3525476-kube-api-access-lcrnv\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.757510 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84cf2aee-27d9-4022-8c67-55840b2faedd-logs\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.757609 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-combined-ca-bundle\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.757758 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data-custom\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.757786 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2mvl\" (UniqueName: \"kubernetes.io/projected/84cf2aee-27d9-4022-8c67-55840b2faedd-kube-api-access-d2mvl\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.757822 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.764184 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.764242 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.766349 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84cf2aee-27d9-4022-8c67-55840b2faedd-logs\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.766807 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-combined-ca-bundle\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.779207 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data-custom\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.796572 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2mvl\" (UniqueName: \"kubernetes.io/projected/84cf2aee-27d9-4022-8c67-55840b2faedd-kube-api-access-d2mvl\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc 
kubenswrapper[4790]: I0313 20:48:10.880984 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.021050 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7c6887dbdb-wnl4x"] Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.022707 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.028691 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.028903 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.036200 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c6887dbdb-wnl4x"] Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.196730 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-public-tls-certs\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.197122 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-config-data\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.197408 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-logs\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.197568 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-internal-tls-certs\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.197643 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-combined-ca-bundle\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.197697 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-config-data-custom\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.197721 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snbp2\" (UniqueName: \"kubernetes.io/projected/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-kube-api-access-snbp2\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.298926 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-public-tls-certs\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.299141 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-config-data\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.299206 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-logs\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.299227 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-internal-tls-certs\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.299269 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-combined-ca-bundle\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.299301 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-config-data-custom\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.299316 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snbp2\" (UniqueName: \"kubernetes.io/projected/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-kube-api-access-snbp2\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.305730 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-logs\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.309911 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-config-data-custom\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.315539 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-config-data\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.315980 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-public-tls-certs\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.320532 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-combined-ca-bundle\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.323986 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-internal-tls-certs\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.326453 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snbp2\" (UniqueName: \"kubernetes.io/projected/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-kube-api-access-snbp2\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.363698 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:14 crc kubenswrapper[4790]: I0313 20:48:14.015538 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:48:14 crc kubenswrapper[4790]: I0313 20:48:14.015823 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:48:14 crc kubenswrapper[4790]: I0313 20:48:14.015863 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:48:14 crc kubenswrapper[4790]: I0313 20:48:14.016325 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"232d637183e61cb15eeba88ed1e9cabcbc6f085073f5f974ddeeeb1a6f8eb83c"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 20:48:14 crc kubenswrapper[4790]: I0313 20:48:14.016390 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://232d637183e61cb15eeba88ed1e9cabcbc6f085073f5f974ddeeeb1a6f8eb83c" gracePeriod=600 Mar 13 20:48:14 crc kubenswrapper[4790]: I0313 20:48:14.878963 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="232d637183e61cb15eeba88ed1e9cabcbc6f085073f5f974ddeeeb1a6f8eb83c" exitCode=0 Mar 13 20:48:14 crc kubenswrapper[4790]: I0313 20:48:14.879023 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"232d637183e61cb15eeba88ed1e9cabcbc6f085073f5f974ddeeeb1a6f8eb83c"} Mar 13 20:48:14 crc kubenswrapper[4790]: I0313 20:48:14.879289 4790 scope.go:117] "RemoveContainer" containerID="1c2f579c051539fdc9bad07dcbfb84169db8dd999445ba48e52c550831462bdf" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.004342 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.132244 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-scripts\") pod \"32ffb609-7a3b-42b7-b513-7003deefe5dd\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.132309 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-db-sync-config-data\") pod \"32ffb609-7a3b-42b7-b513-7003deefe5dd\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.132331 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32ffb609-7a3b-42b7-b513-7003deefe5dd-etc-machine-id\") pod \"32ffb609-7a3b-42b7-b513-7003deefe5dd\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.132459 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-combined-ca-bundle\") pod \"32ffb609-7a3b-42b7-b513-7003deefe5dd\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.132483 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mdhq\" (UniqueName: \"kubernetes.io/projected/32ffb609-7a3b-42b7-b513-7003deefe5dd-kube-api-access-8mdhq\") pod \"32ffb609-7a3b-42b7-b513-7003deefe5dd\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.132550 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-config-data\") pod \"32ffb609-7a3b-42b7-b513-7003deefe5dd\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.132768 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32ffb609-7a3b-42b7-b513-7003deefe5dd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "32ffb609-7a3b-42b7-b513-7003deefe5dd" (UID: "32ffb609-7a3b-42b7-b513-7003deefe5dd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.141484 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "32ffb609-7a3b-42b7-b513-7003deefe5dd" (UID: "32ffb609-7a3b-42b7-b513-7003deefe5dd"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.141573 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32ffb609-7a3b-42b7-b513-7003deefe5dd-kube-api-access-8mdhq" (OuterVolumeSpecName: "kube-api-access-8mdhq") pod "32ffb609-7a3b-42b7-b513-7003deefe5dd" (UID: "32ffb609-7a3b-42b7-b513-7003deefe5dd"). InnerVolumeSpecName "kube-api-access-8mdhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.144493 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-scripts" (OuterVolumeSpecName: "scripts") pod "32ffb609-7a3b-42b7-b513-7003deefe5dd" (UID: "32ffb609-7a3b-42b7-b513-7003deefe5dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.186879 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-config-data" (OuterVolumeSpecName: "config-data") pod "32ffb609-7a3b-42b7-b513-7003deefe5dd" (UID: "32ffb609-7a3b-42b7-b513-7003deefe5dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.189159 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32ffb609-7a3b-42b7-b513-7003deefe5dd" (UID: "32ffb609-7a3b-42b7-b513-7003deefe5dd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.234875 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.234911 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.234921 4790 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.234930 4790 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32ffb609-7a3b-42b7-b513-7003deefe5dd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.234940 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.234949 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mdhq\" (UniqueName: \"kubernetes.io/projected/32ffb609-7a3b-42b7-b513-7003deefe5dd-kube-api-access-8mdhq\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:15 crc kubenswrapper[4790]: E0313 20:48:15.843755 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" 
podUID="1abdfade-817b-4659-b8be-48bb516fb866" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.890071 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g2nmn" event={"ID":"32ffb609-7a3b-42b7-b513-7003deefe5dd","Type":"ContainerDied","Data":"593ecf2c1e6edaf48caa97f46955c4d04cc6ddaa0effd6b586de30210f0a0ecd"} Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.890360 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="593ecf2c1e6edaf48caa97f46955c4d04cc6ddaa0effd6b586de30210f0a0ecd" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.890521 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.893304 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1abdfade-817b-4659-b8be-48bb516fb866","Type":"ContainerStarted","Data":"8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4"} Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.893453 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="ceilometer-notification-agent" containerID="cri-o://0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c" gracePeriod=30 Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.893652 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.893919 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="proxy-httpd" containerID="cri-o://8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4" gracePeriod=30 Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.893960 4790 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="sg-core" containerID="cri-o://fe7297aab5981431006e363000146624b164562815f098000374d6b910719486" gracePeriod=30 Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.895766 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"7265c148a5840e02c0d05363d253e3b056f233c63bc78d73aa4fcf9dbde019eb"} Mar 13 20:48:16 crc kubenswrapper[4790]: W0313 20:48:16.100560 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc5e5f2f_999a_4ae6_82f1_d5942a570a3e.slice/crio-ca862add655685f15c776d4d78786cd31901b29a833e216630b00925c1967d43 WatchSource:0}: Error finding container ca862add655685f15c776d4d78786cd31901b29a833e216630b00925c1967d43: Status 404 returned error can't find the container with id ca862add655685f15c776d4d78786cd31901b29a833e216630b00925c1967d43 Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.100646 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c6887dbdb-wnl4x"] Mar 13 20:48:16 crc kubenswrapper[4790]: W0313 20:48:16.103150 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07188cb9_d2b0_4923_a90c_386eb3525476.slice/crio-4764ae844905c80f49d49c8334596e36411076b75d76e202fe248b7d8e878fc9 WatchSource:0}: Error finding container 4764ae844905c80f49d49c8334596e36411076b75d76e202fe248b7d8e878fc9: Status 404 returned error can't find the container with id 4764ae844905c80f49d49c8334596e36411076b75d76e202fe248b7d8e878fc9 Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.112565 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-85ff748b95-db5jn"] Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.294654 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 20:48:16 crc kubenswrapper[4790]: E0313 20:48:16.295349 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32ffb609-7a3b-42b7-b513-7003deefe5dd" containerName="cinder-db-sync" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.295367 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="32ffb609-7a3b-42b7-b513-7003deefe5dd" containerName="cinder-db-sync" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.295567 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="32ffb609-7a3b-42b7-b513-7003deefe5dd" containerName="cinder-db-sync" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.296724 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.301311 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9qb6s" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.301531 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.301558 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.301477 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.319143 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.350946 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-db5jn"] Mar 13 20:48:16 crc kubenswrapper[4790]: 
I0313 20:48:16.367121 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.368590 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-scripts\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.368766 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.368914 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.368951 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srclt\" (UniqueName: \"kubernetes.io/projected/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-kube-api-access-srclt\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.369041 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.407467 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-5hxds"] Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.409258 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.424228 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-5hxds"] Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472538 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472597 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472625 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srclt\" (UniqueName: \"kubernetes.io/projected/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-kube-api-access-srclt\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc 
kubenswrapper[4790]: I0313 20:48:16.472654 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472704 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472720 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472767 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-scripts\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472802 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-config\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472820 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472852 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472874 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472894 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5spxd\" (UniqueName: \"kubernetes.io/projected/5348982d-ffd4-4226-8c69-1984dc02ffbe-kube-api-access-5spxd\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.473000 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.487759 4790 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.489017 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.493357 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-scripts\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.494472 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.517480 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srclt\" (UniqueName: \"kubernetes.io/projected/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-kube-api-access-srclt\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.524782 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c748666b-tvhxb"] Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.547434 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-keystone-listener-798f469b5d-gs7bt"] Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.556858 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5d9ddc9bbc-tg88r"] Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.574214 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.574256 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5spxd\" (UniqueName: \"kubernetes.io/projected/5348982d-ffd4-4226-8c69-1984dc02ffbe-kube-api-access-5spxd\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.574299 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.574386 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.574453 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-config\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.574474 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.575462 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.575531 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.576007 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.576155 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-config\") pod 
\"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.576992 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.600516 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5spxd\" (UniqueName: \"kubernetes.io/projected/5348982d-ffd4-4226-8c69-1984dc02ffbe-kube-api-access-5spxd\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.602151 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.604563 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.608772 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.626856 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.661596 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.675979 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.676356 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de946747-1160-46da-bacd-7ac005e29c73-etc-machine-id\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.676442 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-scripts\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.676569 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.676618 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data-custom\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.676695 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8w5m\" (UniqueName: \"kubernetes.io/projected/de946747-1160-46da-bacd-7ac005e29c73-kube-api-access-k8w5m\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.676751 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de946747-1160-46da-bacd-7ac005e29c73-logs\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.775934 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.780404 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.780476 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de946747-1160-46da-bacd-7ac005e29c73-etc-machine-id\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.780495 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-scripts\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.780520 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.780540 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data-custom\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.780568 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8w5m\" (UniqueName: \"kubernetes.io/projected/de946747-1160-46da-bacd-7ac005e29c73-kube-api-access-k8w5m\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.780594 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de946747-1160-46da-bacd-7ac005e29c73-logs\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.780988 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de946747-1160-46da-bacd-7ac005e29c73-logs\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.784698 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " 
pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.784836 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de946747-1160-46da-bacd-7ac005e29c73-etc-machine-id\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.785979 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.788472 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data-custom\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.794668 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-scripts\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.808218 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8w5m\" (UniqueName: \"kubernetes.io/projected/de946747-1160-46da-bacd-7ac005e29c73-kube-api-access-k8w5m\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.927336 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" 
event={"ID":"98f92730-30b3-4583-ab7c-258c0a0880a2","Type":"ContainerStarted","Data":"f42b1074404aaa07048c55aad06d728bced62837ddf9c7379f463379b1627e10"} Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.937671 4790 generic.go:334] "Generic (PLEG): container finished" podID="07188cb9-d2b0-4923-a90c-386eb3525476" containerID="0a4d05e5a2a7ca428f12a479e9146ff2c5a49d668908d72ddf47d5d430d6d357" exitCode=0 Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.937789 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-db5jn" event={"ID":"07188cb9-d2b0-4923-a90c-386eb3525476","Type":"ContainerDied","Data":"0a4d05e5a2a7ca428f12a479e9146ff2c5a49d668908d72ddf47d5d430d6d357"} Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.937822 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-db5jn" event={"ID":"07188cb9-d2b0-4923-a90c-386eb3525476","Type":"ContainerStarted","Data":"4764ae844905c80f49d49c8334596e36411076b75d76e202fe248b7d8e878fc9"} Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.941776 4790 generic.go:334] "Generic (PLEG): container finished" podID="1abdfade-817b-4659-b8be-48bb516fb866" containerID="fe7297aab5981431006e363000146624b164562815f098000374d6b910719486" exitCode=2 Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.941862 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1abdfade-817b-4659-b8be-48bb516fb866","Type":"ContainerDied","Data":"fe7297aab5981431006e363000146624b164562815f098000374d6b910719486"} Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.943966 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c6887dbdb-wnl4x" event={"ID":"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e","Type":"ContainerStarted","Data":"b2b8ea66f4e218bdde854ae7f7245f16743983cc1e396a879d914f731601eb3d"} Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.943991 4790 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/barbican-api-7c6887dbdb-wnl4x" event={"ID":"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e","Type":"ContainerStarted","Data":"ca862add655685f15c776d4d78786cd31901b29a833e216630b00925c1967d43"} Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.945635 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c748666b-tvhxb" event={"ID":"84cf2aee-27d9-4022-8c67-55840b2faedd","Type":"ContainerStarted","Data":"8d83c8808f4540d59bea2732861e3d03b6d099d9067691314cc326bd7240a581"} Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.945661 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c748666b-tvhxb" event={"ID":"84cf2aee-27d9-4022-8c67-55840b2faedd","Type":"ContainerStarted","Data":"d42a77ccb8df8e4551bc2d02ca8dcf98b96ca45a3c404d603bdd2962aa71b56a"} Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.947662 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" event={"ID":"8a191811-ef81-4066-bcbb-0385c9258fc0","Type":"ContainerStarted","Data":"9fe1c8529fe60428074588ddddc06140144982d7436f19bdadea3c344503eb7a"} Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.053280 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.247590 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 20:48:17 crc kubenswrapper[4790]: W0313 20:48:17.263997 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dd8df37_b60e_4ef1_9b53_6a59ba59e538.slice/crio-356f1e6d14418c7b4d47c93a6ddd977f1792b488cf4e7ec1247f3eaca698c030 WatchSource:0}: Error finding container 356f1e6d14418c7b4d47c93a6ddd977f1792b488cf4e7ec1247f3eaca698c030: Status 404 returned error can't find the container with id 356f1e6d14418c7b4d47c93a6ddd977f1792b488cf4e7ec1247f3eaca698c030 Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.269592 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.359026 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-5hxds"] Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.460711 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.527179 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcrnv\" (UniqueName: \"kubernetes.io/projected/07188cb9-d2b0-4923-a90c-386eb3525476-kube-api-access-lcrnv\") pod \"07188cb9-d2b0-4923-a90c-386eb3525476\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.527235 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-swift-storage-0\") pod \"07188cb9-d2b0-4923-a90c-386eb3525476\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.527319 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-config\") pod \"07188cb9-d2b0-4923-a90c-386eb3525476\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.527339 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-sb\") pod \"07188cb9-d2b0-4923-a90c-386eb3525476\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.527368 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-svc\") pod \"07188cb9-d2b0-4923-a90c-386eb3525476\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.527821 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-nb\") pod \"07188cb9-d2b0-4923-a90c-386eb3525476\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.541699 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07188cb9-d2b0-4923-a90c-386eb3525476-kube-api-access-lcrnv" (OuterVolumeSpecName: "kube-api-access-lcrnv") pod "07188cb9-d2b0-4923-a90c-386eb3525476" (UID: "07188cb9-d2b0-4923-a90c-386eb3525476"). InnerVolumeSpecName "kube-api-access-lcrnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.560535 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-config" (OuterVolumeSpecName: "config") pod "07188cb9-d2b0-4923-a90c-386eb3525476" (UID: "07188cb9-d2b0-4923-a90c-386eb3525476"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.563826 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "07188cb9-d2b0-4923-a90c-386eb3525476" (UID: "07188cb9-d2b0-4923-a90c-386eb3525476"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.564404 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "07188cb9-d2b0-4923-a90c-386eb3525476" (UID: "07188cb9-d2b0-4923-a90c-386eb3525476"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.569693 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "07188cb9-d2b0-4923-a90c-386eb3525476" (UID: "07188cb9-d2b0-4923-a90c-386eb3525476"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.580584 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "07188cb9-d2b0-4923-a90c-386eb3525476" (UID: "07188cb9-d2b0-4923-a90c-386eb3525476"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.634430 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.634491 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.634505 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.634518 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-svc\") on node \"crc\" 
DevicePath \"\"" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.634531 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.634543 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcrnv\" (UniqueName: \"kubernetes.io/projected/07188cb9-d2b0-4923-a90c-386eb3525476-kube-api-access-lcrnv\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.760310 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.957646 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"de946747-1160-46da-bacd-7ac005e29c73","Type":"ContainerStarted","Data":"8575cdd5bb4666a6bc6bc6c42910b6563764f091451eed83947cc6f64da3a0eb"} Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.963628 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7dd8df37-b60e-4ef1-9b53-6a59ba59e538","Type":"ContainerStarted","Data":"356f1e6d14418c7b4d47c93a6ddd977f1792b488cf4e7ec1247f3eaca698c030"} Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.974188 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.974292 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-db5jn" event={"ID":"07188cb9-d2b0-4923-a90c-386eb3525476","Type":"ContainerDied","Data":"4764ae844905c80f49d49c8334596e36411076b75d76e202fe248b7d8e878fc9"} Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.974362 4790 scope.go:117] "RemoveContainer" containerID="0a4d05e5a2a7ca428f12a479e9146ff2c5a49d668908d72ddf47d5d430d6d357" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.987742 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" event={"ID":"5348982d-ffd4-4226-8c69-1984dc02ffbe","Type":"ContainerStarted","Data":"90cf908fda5bfa83deaae1fd0eac95ba601f9eb9da62b0fab2c3af0677ac98b2"} Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.991630 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c6887dbdb-wnl4x" event={"ID":"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e","Type":"ContainerStarted","Data":"f1c4acbc184c6d684bf1f645d51e17a734a58a06beda3e715c2df4b5b76fdda8"} Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.991730 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.991757 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.999612 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c748666b-tvhxb" event={"ID":"84cf2aee-27d9-4022-8c67-55840b2faedd","Type":"ContainerStarted","Data":"17f7572a60defea9a0c54762cab48549434213f3829ef554fc1c0c0339839360"} Mar 13 20:48:18 crc kubenswrapper[4790]: I0313 20:48:18.000210 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:18 crc kubenswrapper[4790]: I0313 20:48:18.000268 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:18 crc kubenswrapper[4790]: I0313 20:48:18.030680 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-db5jn"] Mar 13 20:48:18 crc kubenswrapper[4790]: I0313 20:48:18.038791 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-db5jn"] Mar 13 20:48:18 crc kubenswrapper[4790]: I0313 20:48:18.064820 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5c748666b-tvhxb" podStartSLOduration=8.064771729 podStartE2EDuration="8.064771729s" podCreationTimestamp="2026-03-13 20:48:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:48:18.044032853 +0000 UTC m=+1229.065148744" watchObservedRunningTime="2026-03-13 20:48:18.064771729 +0000 UTC m=+1229.085887640" Mar 13 20:48:18 crc kubenswrapper[4790]: I0313 20:48:18.089785 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7c6887dbdb-wnl4x" podStartSLOduration=6.08975893 podStartE2EDuration="6.08975893s" podCreationTimestamp="2026-03-13 20:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:48:18.077913327 +0000 UTC m=+1229.099029218" watchObservedRunningTime="2026-03-13 20:48:18.08975893 +0000 UTC m=+1229.110874831" Mar 13 20:48:18 crc kubenswrapper[4790]: I0313 20:48:18.335298 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 20:48:19 crc kubenswrapper[4790]: I0313 20:48:19.017343 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="1abdfade-817b-4659-b8be-48bb516fb866" containerID="0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c" exitCode=0 Mar 13 20:48:19 crc kubenswrapper[4790]: I0313 20:48:19.017512 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1abdfade-817b-4659-b8be-48bb516fb866","Type":"ContainerDied","Data":"0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c"} Mar 13 20:48:19 crc kubenswrapper[4790]: I0313 20:48:19.026459 4790 generic.go:334] "Generic (PLEG): container finished" podID="5348982d-ffd4-4226-8c69-1984dc02ffbe" containerID="54996574df6debfb6f3430b43b232f15654c266d463f051ee19ed34e62244f6c" exitCode=0 Mar 13 20:48:19 crc kubenswrapper[4790]: I0313 20:48:19.026570 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" event={"ID":"5348982d-ffd4-4226-8c69-1984dc02ffbe","Type":"ContainerDied","Data":"54996574df6debfb6f3430b43b232f15654c266d463f051ee19ed34e62244f6c"} Mar 13 20:48:19 crc kubenswrapper[4790]: I0313 20:48:19.030627 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"de946747-1160-46da-bacd-7ac005e29c73","Type":"ContainerStarted","Data":"76f821fb3e21a6dfe1f8aa6b85c0c0d915466d0811f4b0f0ac00a80caa7a5cca"} Mar 13 20:48:19 crc kubenswrapper[4790]: I0313 20:48:19.669346 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07188cb9-d2b0-4923-a90c-386eb3525476" path="/var/lib/kubelet/pods/07188cb9-d2b0-4923-a90c-386eb3525476/volumes" Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.060830 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" event={"ID":"98f92730-30b3-4583-ab7c-258c0a0880a2","Type":"ContainerStarted","Data":"4c236a9371cc59f1b41e2ac193fdd9dcd75b79349c6c12f2a9599d496f917fcc"} Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.061112 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" event={"ID":"98f92730-30b3-4583-ab7c-258c0a0880a2","Type":"ContainerStarted","Data":"e13d8c800ad8f3982a06147ccc8adf9e99e1133e8118d3665ddf15123e344c2f"} Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.073087 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7dd8df37-b60e-4ef1-9b53-6a59ba59e538","Type":"ContainerStarted","Data":"b27907adc19d02cf9eb527f95f4e0f1927d997cba55d2e2d8cff7b9730da30e9"} Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.079436 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" event={"ID":"5348982d-ffd4-4226-8c69-1984dc02ffbe","Type":"ContainerStarted","Data":"716981bb1f84db71c9f5dc98afe338733b9e6edcfcc60b6262bd07cda695e5bb"} Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.080712 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.082582 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" podStartSLOduration=7.636768565 podStartE2EDuration="10.082556633s" podCreationTimestamp="2026-03-13 20:48:10 +0000 UTC" firstStartedPulling="2026-03-13 20:48:16.563838932 +0000 UTC m=+1227.584954823" lastFinishedPulling="2026-03-13 20:48:19.009627 +0000 UTC m=+1230.030742891" observedRunningTime="2026-03-13 20:48:20.078363469 +0000 UTC m=+1231.099479360" watchObservedRunningTime="2026-03-13 20:48:20.082556633 +0000 UTC m=+1231.103672524" Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.089683 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" event={"ID":"8a191811-ef81-4066-bcbb-0385c9258fc0","Type":"ContainerStarted","Data":"7191bb7876683751358d6549cd6e52fa76a26e1517da6569114a61bf3468e91f"} Mar 13 20:48:20 crc kubenswrapper[4790]: 
I0313 20:48:20.089742 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" event={"ID":"8a191811-ef81-4066-bcbb-0385c9258fc0","Type":"ContainerStarted","Data":"f14c639564d76bb3597d69ed741ad05df72a17b061329b6b4b5e69fea9429dd4"} Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.102079 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"de946747-1160-46da-bacd-7ac005e29c73","Type":"ContainerStarted","Data":"0036043a9c1b9762413a450b9b563143cc9cb9428e1df3c4935884d5cfb2c1e7"} Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.102235 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="de946747-1160-46da-bacd-7ac005e29c73" containerName="cinder-api-log" containerID="cri-o://76f821fb3e21a6dfe1f8aa6b85c0c0d915466d0811f4b0f0ac00a80caa7a5cca" gracePeriod=30 Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.102301 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="de946747-1160-46da-bacd-7ac005e29c73" containerName="cinder-api" containerID="cri-o://0036043a9c1b9762413a450b9b563143cc9cb9428e1df3c4935884d5cfb2c1e7" gracePeriod=30 Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.102357 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.113456 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" podStartSLOduration=4.113433866 podStartE2EDuration="4.113433866s" podCreationTimestamp="2026-03-13 20:48:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:48:20.111993886 +0000 UTC m=+1231.133109787" watchObservedRunningTime="2026-03-13 20:48:20.113433866 +0000 UTC 
m=+1231.134549777" Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.140190 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.140172084 podStartE2EDuration="4.140172084s" podCreationTimestamp="2026-03-13 20:48:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:48:20.136109874 +0000 UTC m=+1231.157225775" watchObservedRunningTime="2026-03-13 20:48:20.140172084 +0000 UTC m=+1231.161287975" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.114996 4790 generic.go:334] "Generic (PLEG): container finished" podID="de946747-1160-46da-bacd-7ac005e29c73" containerID="0036043a9c1b9762413a450b9b563143cc9cb9428e1df3c4935884d5cfb2c1e7" exitCode=0 Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.115257 4790 generic.go:334] "Generic (PLEG): container finished" podID="de946747-1160-46da-bacd-7ac005e29c73" containerID="76f821fb3e21a6dfe1f8aa6b85c0c0d915466d0811f4b0f0ac00a80caa7a5cca" exitCode=143 Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.115098 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"de946747-1160-46da-bacd-7ac005e29c73","Type":"ContainerDied","Data":"0036043a9c1b9762413a450b9b563143cc9cb9428e1df3c4935884d5cfb2c1e7"} Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.115423 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"de946747-1160-46da-bacd-7ac005e29c73","Type":"ContainerDied","Data":"76f821fb3e21a6dfe1f8aa6b85c0c0d915466d0811f4b0f0ac00a80caa7a5cca"} Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.200679 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.218316 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" podStartSLOduration=8.773690594 podStartE2EDuration="11.21829477s" podCreationTimestamp="2026-03-13 20:48:10 +0000 UTC" firstStartedPulling="2026-03-13 20:48:16.563538493 +0000 UTC m=+1227.584654374" lastFinishedPulling="2026-03-13 20:48:19.008142659 +0000 UTC m=+1230.029258550" observedRunningTime="2026-03-13 20:48:20.164334074 +0000 UTC m=+1231.185449975" watchObservedRunningTime="2026-03-13 20:48:21.21829477 +0000 UTC m=+1232.239410661" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.493363 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-655d56d4d9-rckws"] Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.493925 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-655d56d4d9-rckws" podUID="ebc86632-179c-403a-bbdd-d496a21c018c" containerName="neutron-api" containerID="cri-o://9d9f4f92c9adc75b1871526d72226856908062d5bbfea344d681a5684ec5cad0" gracePeriod=30 Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.494032 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-655d56d4d9-rckws" podUID="ebc86632-179c-403a-bbdd-d496a21c018c" containerName="neutron-httpd" containerID="cri-o://abc0a2f8a645b936e1377bd5e49e6a7c687f5aa7ff92e068165fc9da1349ac66" gracePeriod=30 Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.529217 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-77f687ff4f-d7b7z"] Mar 13 20:48:21 crc kubenswrapper[4790]: E0313 20:48:21.529623 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07188cb9-d2b0-4923-a90c-386eb3525476" containerName="init" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.529645 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="07188cb9-d2b0-4923-a90c-386eb3525476" containerName="init" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.529835 4790 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="07188cb9-d2b0-4923-a90c-386eb3525476" containerName="init" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.530711 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.550519 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77f687ff4f-d7b7z"] Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.551775 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-655d56d4d9-rckws" podUID="ebc86632-179c-403a-bbdd-d496a21c018c" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9696/\": EOF" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.610069 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-public-tls-certs\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.610183 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-httpd-config\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.610222 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-config\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.610292 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-combined-ca-bundle\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.610361 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-internal-tls-certs\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.610650 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-ovndb-tls-certs\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.610705 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfdkt\" (UniqueName: \"kubernetes.io/projected/6c6f5d56-217d-441e-8771-503fd5e681fb-kube-api-access-gfdkt\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.712342 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfdkt\" (UniqueName: \"kubernetes.io/projected/6c6f5d56-217d-441e-8771-503fd5e681fb-kube-api-access-gfdkt\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 
20:48:21.712461 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-public-tls-certs\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.712557 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-httpd-config\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.712574 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-config\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.713315 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-combined-ca-bundle\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.713350 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-internal-tls-certs\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.713445 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-ovndb-tls-certs\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.718492 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-internal-tls-certs\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.718746 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-public-tls-certs\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.719453 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-httpd-config\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.719664 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-combined-ca-bundle\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.720083 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-ovndb-tls-certs\") pod 
\"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.725467 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-config\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.742145 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfdkt\" (UniqueName: \"kubernetes.io/projected/6c6f5d56-217d-441e-8771-503fd5e681fb-kube-api-access-gfdkt\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.851909 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.142036 4790 generic.go:334] "Generic (PLEG): container finished" podID="ebc86632-179c-403a-bbdd-d496a21c018c" containerID="abc0a2f8a645b936e1377bd5e49e6a7c687f5aa7ff92e068165fc9da1349ac66" exitCode=0 Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.142477 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-655d56d4d9-rckws" event={"ID":"ebc86632-179c-403a-bbdd-d496a21c018c","Type":"ContainerDied","Data":"abc0a2f8a645b936e1377bd5e49e6a7c687f5aa7ff92e068165fc9da1349ac66"} Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.153635 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7dd8df37-b60e-4ef1-9b53-6a59ba59e538","Type":"ContainerStarted","Data":"ffca9dd21fbe9bb0e162063d797b781df50325594db940d15d5f923d328ac878"} Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.190902 4790 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.46196809 podStartE2EDuration="6.190886137s" podCreationTimestamp="2026-03-13 20:48:16 +0000 UTC" firstStartedPulling="2026-03-13 20:48:17.269306222 +0000 UTC m=+1228.290422123" lastFinishedPulling="2026-03-13 20:48:18.998224279 +0000 UTC m=+1230.019340170" observedRunningTime="2026-03-13 20:48:22.188180393 +0000 UTC m=+1233.209296284" watchObservedRunningTime="2026-03-13 20:48:22.190886137 +0000 UTC m=+1233.212002028" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.409398 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.528125 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data-custom\") pod \"de946747-1160-46da-bacd-7ac005e29c73\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.529455 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8w5m\" (UniqueName: \"kubernetes.io/projected/de946747-1160-46da-bacd-7ac005e29c73-kube-api-access-k8w5m\") pod \"de946747-1160-46da-bacd-7ac005e29c73\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.531074 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de946747-1160-46da-bacd-7ac005e29c73-logs" (OuterVolumeSpecName: "logs") pod "de946747-1160-46da-bacd-7ac005e29c73" (UID: "de946747-1160-46da-bacd-7ac005e29c73"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.531246 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de946747-1160-46da-bacd-7ac005e29c73-logs\") pod \"de946747-1160-46da-bacd-7ac005e29c73\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.531437 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-combined-ca-bundle\") pod \"de946747-1160-46da-bacd-7ac005e29c73\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.531553 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de946747-1160-46da-bacd-7ac005e29c73-etc-machine-id\") pod \"de946747-1160-46da-bacd-7ac005e29c73\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.531701 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data\") pod \"de946747-1160-46da-bacd-7ac005e29c73\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.531817 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-scripts\") pod \"de946747-1160-46da-bacd-7ac005e29c73\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.533932 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/de946747-1160-46da-bacd-7ac005e29c73-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "de946747-1160-46da-bacd-7ac005e29c73" (UID: "de946747-1160-46da-bacd-7ac005e29c73"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.537398 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "de946747-1160-46da-bacd-7ac005e29c73" (UID: "de946747-1160-46da-bacd-7ac005e29c73"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.537650 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de946747-1160-46da-bacd-7ac005e29c73-kube-api-access-k8w5m" (OuterVolumeSpecName: "kube-api-access-k8w5m") pod "de946747-1160-46da-bacd-7ac005e29c73" (UID: "de946747-1160-46da-bacd-7ac005e29c73"). InnerVolumeSpecName "kube-api-access-k8w5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.537800 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-scripts" (OuterVolumeSpecName: "scripts") pod "de946747-1160-46da-bacd-7ac005e29c73" (UID: "de946747-1160-46da-bacd-7ac005e29c73"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.596477 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data" (OuterVolumeSpecName: "config-data") pod "de946747-1160-46da-bacd-7ac005e29c73" (UID: "de946747-1160-46da-bacd-7ac005e29c73"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.602659 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de946747-1160-46da-bacd-7ac005e29c73" (UID: "de946747-1160-46da-bacd-7ac005e29c73"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.637615 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.637656 4790 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de946747-1160-46da-bacd-7ac005e29c73-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.637667 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.637675 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.637689 4790 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.637700 4790 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-k8w5m\" (UniqueName: \"kubernetes.io/projected/de946747-1160-46da-bacd-7ac005e29c73-kube-api-access-k8w5m\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.637713 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de946747-1160-46da-bacd-7ac005e29c73-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.695103 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77f687ff4f-d7b7z"] Mar 13 20:48:22 crc kubenswrapper[4790]: W0313 20:48:22.701932 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c6f5d56_217d_441e_8771_503fd5e681fb.slice/crio-80172ef258e47f41d20fe7c918127a8ed517fb3a3a6cd11675295cf2b42c2872 WatchSource:0}: Error finding container 80172ef258e47f41d20fe7c918127a8ed517fb3a3a6cd11675295cf2b42c2872: Status 404 returned error can't find the container with id 80172ef258e47f41d20fe7c918127a8ed517fb3a3a6cd11675295cf2b42c2872 Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.838215 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.915794 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.036012 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.149877 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdqgz\" (UniqueName: \"kubernetes.io/projected/ccb1b2e8-4b05-411b-a540-6507fdd5775f-kube-api-access-bdqgz\") pod \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.150233 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-config-data\") pod \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.150359 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccb1b2e8-4b05-411b-a540-6507fdd5775f-logs\") pod \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.150430 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-scripts\") pod \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.150513 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ccb1b2e8-4b05-411b-a540-6507fdd5775f-horizon-secret-key\") pod \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.152573 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ccb1b2e8-4b05-411b-a540-6507fdd5775f-logs" (OuterVolumeSpecName: "logs") pod "ccb1b2e8-4b05-411b-a540-6507fdd5775f" (UID: "ccb1b2e8-4b05-411b-a540-6507fdd5775f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.155172 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb1b2e8-4b05-411b-a540-6507fdd5775f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ccb1b2e8-4b05-411b-a540-6507fdd5775f" (UID: "ccb1b2e8-4b05-411b-a540-6507fdd5775f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.166732 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccb1b2e8-4b05-411b-a540-6507fdd5775f-kube-api-access-bdqgz" (OuterVolumeSpecName: "kube-api-access-bdqgz") pod "ccb1b2e8-4b05-411b-a540-6507fdd5775f" (UID: "ccb1b2e8-4b05-411b-a540-6507fdd5775f"). InnerVolumeSpecName "kube-api-access-bdqgz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.172739 4790 generic.go:334] "Generic (PLEG): container finished" podID="ccb1b2e8-4b05-411b-a540-6507fdd5775f" containerID="45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d" exitCode=137 Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.172774 4790 generic.go:334] "Generic (PLEG): container finished" podID="ccb1b2e8-4b05-411b-a540-6507fdd5775f" containerID="f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488" exitCode=137 Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.172808 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76485b6c5-pjfp4" event={"ID":"ccb1b2e8-4b05-411b-a540-6507fdd5775f","Type":"ContainerDied","Data":"45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d"} Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.172834 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76485b6c5-pjfp4" event={"ID":"ccb1b2e8-4b05-411b-a540-6507fdd5775f","Type":"ContainerDied","Data":"f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488"} Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.172845 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76485b6c5-pjfp4" event={"ID":"ccb1b2e8-4b05-411b-a540-6507fdd5775f","Type":"ContainerDied","Data":"0043283db8557a3c195bf15d2769b7b9f5dc50b554145edd4e6c6abedb9ab898"} Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.172860 4790 scope.go:117] "RemoveContainer" containerID="45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.173053 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.182118 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-scripts" (OuterVolumeSpecName: "scripts") pod "ccb1b2e8-4b05-411b-a540-6507fdd5775f" (UID: "ccb1b2e8-4b05-411b-a540-6507fdd5775f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.198940 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"de946747-1160-46da-bacd-7ac005e29c73","Type":"ContainerDied","Data":"8575cdd5bb4666a6bc6bc6c42910b6563764f091451eed83947cc6f64da3a0eb"} Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.199062 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.199256 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-config-data" (OuterVolumeSpecName: "config-data") pod "ccb1b2e8-4b05-411b-a540-6507fdd5775f" (UID: "ccb1b2e8-4b05-411b-a540-6507fdd5775f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.213567 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77f687ff4f-d7b7z" event={"ID":"6c6f5d56-217d-441e-8771-503fd5e681fb","Type":"ContainerStarted","Data":"430ecc8dc6d2c3b6e5e4d4a5d13a83f7c8f4e9ef0f8f462eadd32e1e37375e29"} Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.213610 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77f687ff4f-d7b7z" event={"ID":"6c6f5d56-217d-441e-8771-503fd5e681fb","Type":"ContainerStarted","Data":"80172ef258e47f41d20fe7c918127a8ed517fb3a3a6cd11675295cf2b42c2872"} Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.243334 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.252286 4790 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ccb1b2e8-4b05-411b-a540-6507fdd5775f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.252327 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdqgz\" (UniqueName: \"kubernetes.io/projected/ccb1b2e8-4b05-411b-a540-6507fdd5775f-kube-api-access-bdqgz\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.252339 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.252410 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccb1b2e8-4b05-411b-a540-6507fdd5775f-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.252421 4790 reconciler_common.go:293] "Volume 
detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.255481 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.264306 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 13 20:48:23 crc kubenswrapper[4790]: E0313 20:48:23.264716 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de946747-1160-46da-bacd-7ac005e29c73" containerName="cinder-api" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.264734 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="de946747-1160-46da-bacd-7ac005e29c73" containerName="cinder-api" Mar 13 20:48:23 crc kubenswrapper[4790]: E0313 20:48:23.264744 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb1b2e8-4b05-411b-a540-6507fdd5775f" containerName="horizon" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.264750 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb1b2e8-4b05-411b-a540-6507fdd5775f" containerName="horizon" Mar 13 20:48:23 crc kubenswrapper[4790]: E0313 20:48:23.264769 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb1b2e8-4b05-411b-a540-6507fdd5775f" containerName="horizon-log" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.264775 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb1b2e8-4b05-411b-a540-6507fdd5775f" containerName="horizon-log" Mar 13 20:48:23 crc kubenswrapper[4790]: E0313 20:48:23.264787 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de946747-1160-46da-bacd-7ac005e29c73" containerName="cinder-api-log" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.264792 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="de946747-1160-46da-bacd-7ac005e29c73" containerName="cinder-api-log" 
Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.264987 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="de946747-1160-46da-bacd-7ac005e29c73" containerName="cinder-api" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.265017 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="de946747-1160-46da-bacd-7ac005e29c73" containerName="cinder-api-log" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.265026 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb1b2e8-4b05-411b-a540-6507fdd5775f" containerName="horizon-log" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.265039 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb1b2e8-4b05-411b-a540-6507fdd5775f" containerName="horizon" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.268101 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.312895 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.313315 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.314000 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.314120 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.366519 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c42a2a27-f7c5-463b-982a-4dafcac978ad-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 
20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.366619 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.366638 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c42a2a27-f7c5-463b-982a-4dafcac978ad-logs\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.366664 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-scripts\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.366682 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-config-data-custom\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.366704 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.366728 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.366768 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-config-data\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.366803 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5fv2\" (UniqueName: \"kubernetes.io/projected/c42a2a27-f7c5-463b-982a-4dafcac978ad-kube-api-access-j5fv2\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.376575 4790 scope.go:117] "RemoveContainer" containerID="f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.469363 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-config-data\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.469445 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5fv2\" (UniqueName: \"kubernetes.io/projected/c42a2a27-f7c5-463b-982a-4dafcac978ad-kube-api-access-j5fv2\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.469544 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c42a2a27-f7c5-463b-982a-4dafcac978ad-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.469612 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.469646 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c42a2a27-f7c5-463b-982a-4dafcac978ad-logs\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.469674 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-scripts\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.469695 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-config-data-custom\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.469724 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " 
pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.469748 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.473841 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.474103 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c42a2a27-f7c5-463b-982a-4dafcac978ad-logs\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.474394 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c42a2a27-f7c5-463b-982a-4dafcac978ad-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.480414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.484397 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-scripts\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.484723 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-config-data\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.485898 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-config-data-custom\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.487975 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.496459 4790 scope.go:117] "RemoveContainer" containerID="45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.498799 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5fv2\" (UniqueName: \"kubernetes.io/projected/c42a2a27-f7c5-463b-982a-4dafcac978ad-kube-api-access-j5fv2\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: E0313 20:48:23.504321 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d\": container with ID starting with 45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d not found: ID does not exist" containerID="45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.504365 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d"} err="failed to get container status \"45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d\": rpc error: code = NotFound desc = could not find container \"45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d\": container with ID starting with 45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d not found: ID does not exist" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.504401 4790 scope.go:117] "RemoveContainer" containerID="f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488" Mar 13 20:48:23 crc kubenswrapper[4790]: E0313 20:48:23.505037 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488\": container with ID starting with f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488 not found: ID does not exist" containerID="f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.505063 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488"} err="failed to get container status \"f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488\": rpc error: code = NotFound desc = could not find container \"f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488\": container with ID 
starting with f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488 not found: ID does not exist" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.505077 4790 scope.go:117] "RemoveContainer" containerID="45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.505547 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d"} err="failed to get container status \"45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d\": rpc error: code = NotFound desc = could not find container \"45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d\": container with ID starting with 45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d not found: ID does not exist" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.505577 4790 scope.go:117] "RemoveContainer" containerID="f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.506752 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488"} err="failed to get container status \"f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488\": rpc error: code = NotFound desc = could not find container \"f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488\": container with ID starting with f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488 not found: ID does not exist" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.506802 4790 scope.go:117] "RemoveContainer" containerID="0036043a9c1b9762413a450b9b563143cc9cb9428e1df3c4935884d5cfb2c1e7" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.544843 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76485b6c5-pjfp4"] Mar 13 
20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.546533 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-655d56d4d9-rckws" podUID="ebc86632-179c-403a-bbdd-d496a21c018c" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9696/\": dial tcp 10.217.0.158:9696: connect: connection refused" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.557599 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-76485b6c5-pjfp4"] Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.562201 4790 scope.go:117] "RemoveContainer" containerID="76f821fb3e21a6dfe1f8aa6b85c0c0d915466d0811f4b0f0ac00a80caa7a5cca" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.612205 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.678989 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccb1b2e8-4b05-411b-a540-6507fdd5775f" path="/var/lib/kubelet/pods/ccb1b2e8-4b05-411b-a540-6507fdd5775f/volumes" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.680262 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de946747-1160-46da-bacd-7ac005e29c73" path="/var/lib/kubelet/pods/de946747-1160-46da-bacd-7ac005e29c73/volumes" Mar 13 20:48:24 crc kubenswrapper[4790]: I0313 20:48:24.092631 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 20:48:24 crc kubenswrapper[4790]: W0313 20:48:24.095612 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc42a2a27_f7c5_463b_982a_4dafcac978ad.slice/crio-bfa0be9b090e12370c0ace79ae50bfc0dfede9aa6ff57df04c11ed5c2a4e1dda WatchSource:0}: Error finding container bfa0be9b090e12370c0ace79ae50bfc0dfede9aa6ff57df04c11ed5c2a4e1dda: Status 404 returned error can't find the container with id 
bfa0be9b090e12370c0ace79ae50bfc0dfede9aa6ff57df04c11ed5c2a4e1dda Mar 13 20:48:24 crc kubenswrapper[4790]: I0313 20:48:24.242244 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77f687ff4f-d7b7z" event={"ID":"6c6f5d56-217d-441e-8771-503fd5e681fb","Type":"ContainerStarted","Data":"8e4d8d580c802f5fa347b1de5b751755e61399b2b76ba7aa9c6d3026313afeaa"} Mar 13 20:48:24 crc kubenswrapper[4790]: I0313 20:48:24.242636 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:24 crc kubenswrapper[4790]: I0313 20:48:24.246856 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c42a2a27-f7c5-463b-982a-4dafcac978ad","Type":"ContainerStarted","Data":"bfa0be9b090e12370c0ace79ae50bfc0dfede9aa6ff57df04c11ed5c2a4e1dda"} Mar 13 20:48:24 crc kubenswrapper[4790]: I0313 20:48:24.266617 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-77f687ff4f-d7b7z" podStartSLOduration=3.266599091 podStartE2EDuration="3.266599091s" podCreationTimestamp="2026-03-13 20:48:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:48:24.261609724 +0000 UTC m=+1235.282725635" watchObservedRunningTime="2026-03-13 20:48:24.266599091 +0000 UTC m=+1235.287714982" Mar 13 20:48:24 crc kubenswrapper[4790]: I0313 20:48:24.748526 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.121096 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.203204 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-77655f674d-4r7h4"] Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.233369 4790 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.297346 4790 generic.go:334] "Generic (PLEG): container finished" podID="ebc86632-179c-403a-bbdd-d496a21c018c" containerID="9d9f4f92c9adc75b1871526d72226856908062d5bbfea344d681a5684ec5cad0" exitCode=0 Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.297553 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-655d56d4d9-rckws" event={"ID":"ebc86632-179c-403a-bbdd-d496a21c018c","Type":"ContainerDied","Data":"9d9f4f92c9adc75b1871526d72226856908062d5bbfea344d681a5684ec5cad0"} Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.323190 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c42a2a27-f7c5-463b-982a-4dafcac978ad","Type":"ContainerStarted","Data":"a7466c9393a2e7ac2f9a9cc84c569eed3edbc93a91ed18f031c1147c7828cd61"} Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.325150 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-77655f674d-4r7h4" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon-log" containerID="cri-o://75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483" gracePeriod=30 Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.325175 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-77655f674d-4r7h4" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon" containerID="cri-o://59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69" gracePeriod=30 Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.772973 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.817268 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.856443 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5c748666b-tvhxb"] Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.856713 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5c748666b-tvhxb" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api-log" containerID="cri-o://8d83c8808f4540d59bea2732861e3d03b6d099d9067691314cc326bd7240a581" gracePeriod=30 Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.857582 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5c748666b-tvhxb" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api" containerID="cri-o://17f7572a60defea9a0c54762cab48549434213f3829ef554fc1c0c0339839360" gracePeriod=30 Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.920651 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-ovndb-tls-certs\") pod \"ebc86632-179c-403a-bbdd-d496a21c018c\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.920781 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-combined-ca-bundle\") pod \"ebc86632-179c-403a-bbdd-d496a21c018c\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.920818 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-config\") pod \"ebc86632-179c-403a-bbdd-d496a21c018c\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") 
" Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.920863 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-internal-tls-certs\") pod \"ebc86632-179c-403a-bbdd-d496a21c018c\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.920981 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-public-tls-certs\") pod \"ebc86632-179c-403a-bbdd-d496a21c018c\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.921092 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-httpd-config\") pod \"ebc86632-179c-403a-bbdd-d496a21c018c\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.921127 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9968b\" (UniqueName: \"kubernetes.io/projected/ebc86632-179c-403a-bbdd-d496a21c018c-kube-api-access-9968b\") pod \"ebc86632-179c-403a-bbdd-d496a21c018c\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.950471 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ebc86632-179c-403a-bbdd-d496a21c018c" (UID: "ebc86632-179c-403a-bbdd-d496a21c018c"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.973997 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebc86632-179c-403a-bbdd-d496a21c018c-kube-api-access-9968b" (OuterVolumeSpecName: "kube-api-access-9968b") pod "ebc86632-179c-403a-bbdd-d496a21c018c" (UID: "ebc86632-179c-403a-bbdd-d496a21c018c"). InnerVolumeSpecName "kube-api-access-9968b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.023410 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.023480 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9968b\" (UniqueName: \"kubernetes.io/projected/ebc86632-179c-403a-bbdd-d496a21c018c-kube-api-access-9968b\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.061081 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-config" (OuterVolumeSpecName: "config") pod "ebc86632-179c-403a-bbdd-d496a21c018c" (UID: "ebc86632-179c-403a-bbdd-d496a21c018c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.079859 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ebc86632-179c-403a-bbdd-d496a21c018c" (UID: "ebc86632-179c-403a-bbdd-d496a21c018c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.082658 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebc86632-179c-403a-bbdd-d496a21c018c" (UID: "ebc86632-179c-403a-bbdd-d496a21c018c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.086492 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "ebc86632-179c-403a-bbdd-d496a21c018c" (UID: "ebc86632-179c-403a-bbdd-d496a21c018c"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.110516 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ebc86632-179c-403a-bbdd-d496a21c018c" (UID: "ebc86632-179c-403a-bbdd-d496a21c018c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.126783 4790 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.126814 4790 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.126823 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.126832 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.126840 4790 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.331909 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c42a2a27-f7c5-463b-982a-4dafcac978ad","Type":"ContainerStarted","Data":"c76b91edd8bb8101c227d42c9cd4c199b919c4c67e9a68dc22a5274872cc82c7"} Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.333208 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.333936 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerID="8d83c8808f4540d59bea2732861e3d03b6d099d9067691314cc326bd7240a581" exitCode=143 Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.334019 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c748666b-tvhxb" event={"ID":"84cf2aee-27d9-4022-8c67-55840b2faedd","Type":"ContainerDied","Data":"8d83c8808f4540d59bea2732861e3d03b6d099d9067691314cc326bd7240a581"} Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.335773 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-655d56d4d9-rckws" event={"ID":"ebc86632-179c-403a-bbdd-d496a21c018c","Type":"ContainerDied","Data":"84f278a1006894e4224ea478ecf0e8138ab1ab2094c647ca6b5763b2c261a6bc"} Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.335815 4790 scope.go:117] "RemoveContainer" containerID="abc0a2f8a645b936e1377bd5e49e6a7c687f5aa7ff92e068165fc9da1349ac66" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.335828 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.364728 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.364703656 podStartE2EDuration="3.364703656s" podCreationTimestamp="2026-03-13 20:48:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:48:26.36232103 +0000 UTC m=+1237.383436921" watchObservedRunningTime="2026-03-13 20:48:26.364703656 +0000 UTC m=+1237.385819547" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.391436 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-655d56d4d9-rckws"] Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.400219 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-655d56d4d9-rckws"] Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.400580 4790 scope.go:117] "RemoveContainer" containerID="9d9f4f92c9adc75b1871526d72226856908062d5bbfea344d681a5684ec5cad0" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.662212 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.779258 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.873630 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-87qd2"] Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.874422 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" podUID="6792eda6-a284-42ab-a650-f21b012f7f44" containerName="dnsmasq-dns" containerID="cri-o://de1f1e831185e14abf69fe3f42e9442a69f0019e8f482343c4e0783b1bafacb8" gracePeriod=10 
Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.131880 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.349273 4790 generic.go:334] "Generic (PLEG): container finished" podID="6792eda6-a284-42ab-a650-f21b012f7f44" containerID="de1f1e831185e14abf69fe3f42e9442a69f0019e8f482343c4e0783b1bafacb8" exitCode=0 Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.349370 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" event={"ID":"6792eda6-a284-42ab-a650-f21b012f7f44","Type":"ContainerDied","Data":"de1f1e831185e14abf69fe3f42e9442a69f0019e8f482343c4e0783b1bafacb8"} Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.405784 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.554867 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.653654 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-config\") pod \"6792eda6-a284-42ab-a650-f21b012f7f44\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.653736 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-sb\") pod \"6792eda6-a284-42ab-a650-f21b012f7f44\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.653829 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-nb\") pod \"6792eda6-a284-42ab-a650-f21b012f7f44\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.653958 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xvqz\" (UniqueName: \"kubernetes.io/projected/6792eda6-a284-42ab-a650-f21b012f7f44-kube-api-access-6xvqz\") pod \"6792eda6-a284-42ab-a650-f21b012f7f44\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.653986 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-svc\") pod \"6792eda6-a284-42ab-a650-f21b012f7f44\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.654085 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-swift-storage-0\") pod \"6792eda6-a284-42ab-a650-f21b012f7f44\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.667425 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6792eda6-a284-42ab-a650-f21b012f7f44-kube-api-access-6xvqz" (OuterVolumeSpecName: "kube-api-access-6xvqz") pod "6792eda6-a284-42ab-a650-f21b012f7f44" (UID: "6792eda6-a284-42ab-a650-f21b012f7f44"). InnerVolumeSpecName "kube-api-access-6xvqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.677058 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebc86632-179c-403a-bbdd-d496a21c018c" path="/var/lib/kubelet/pods/ebc86632-179c-403a-bbdd-d496a21c018c/volumes" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.703898 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6792eda6-a284-42ab-a650-f21b012f7f44" (UID: "6792eda6-a284-42ab-a650-f21b012f7f44"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.707037 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6792eda6-a284-42ab-a650-f21b012f7f44" (UID: "6792eda6-a284-42ab-a650-f21b012f7f44"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.710752 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6792eda6-a284-42ab-a650-f21b012f7f44" (UID: "6792eda6-a284-42ab-a650-f21b012f7f44"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.717099 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-config" (OuterVolumeSpecName: "config") pod "6792eda6-a284-42ab-a650-f21b012f7f44" (UID: "6792eda6-a284-42ab-a650-f21b012f7f44"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.728340 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6792eda6-a284-42ab-a650-f21b012f7f44" (UID: "6792eda6-a284-42ab-a650-f21b012f7f44"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.756445 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.756491 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xvqz\" (UniqueName: \"kubernetes.io/projected/6792eda6-a284-42ab-a650-f21b012f7f44-kube-api-access-6xvqz\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.756510 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.756523 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.756534 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.756544 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:28 crc kubenswrapper[4790]: I0313 20:48:28.008331 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:28 crc kubenswrapper[4790]: I0313 20:48:28.064724 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:28 crc kubenswrapper[4790]: I0313 20:48:28.363349 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7dd8df37-b60e-4ef1-9b53-6a59ba59e538" containerName="cinder-scheduler" containerID="cri-o://b27907adc19d02cf9eb527f95f4e0f1927d997cba55d2e2d8cff7b9730da30e9" gracePeriod=30 Mar 13 20:48:28 crc kubenswrapper[4790]: I0313 20:48:28.363841 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:48:28 crc kubenswrapper[4790]: I0313 20:48:28.363865 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" event={"ID":"6792eda6-a284-42ab-a650-f21b012f7f44","Type":"ContainerDied","Data":"eb26cba5d4f1f28cf0c444cc204a575c1f7bb95d1f8f9337a19506bba53fe819"} Mar 13 20:48:28 crc kubenswrapper[4790]: I0313 20:48:28.364083 4790 scope.go:117] "RemoveContainer" containerID="de1f1e831185e14abf69fe3f42e9442a69f0019e8f482343c4e0783b1bafacb8" Mar 13 20:48:28 crc kubenswrapper[4790]: I0313 20:48:28.364180 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7dd8df37-b60e-4ef1-9b53-6a59ba59e538" containerName="probe" containerID="cri-o://ffca9dd21fbe9bb0e162063d797b781df50325594db940d15d5f923d328ac878" gracePeriod=30 Mar 13 20:48:28 crc kubenswrapper[4790]: I0313 20:48:28.395967 4790 scope.go:117] "RemoveContainer" containerID="c85d717e10fb599c6b3d50e3cfc797654ac9faa539262c4f8824bda9117967e3" Mar 13 20:48:28 crc kubenswrapper[4790]: I0313 20:48:28.415850 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-87qd2"] Mar 13 20:48:28 crc kubenswrapper[4790]: I0313 20:48:28.425447 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-87qd2"] Mar 13 20:48:29 crc kubenswrapper[4790]: I0313 20:48:29.379459 4790 
generic.go:334] "Generic (PLEG): container finished" podID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerID="59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69" exitCode=0 Mar 13 20:48:29 crc kubenswrapper[4790]: I0313 20:48:29.379721 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77655f674d-4r7h4" event={"ID":"596ad32f-9087-4dbe-a495-8bf03200cd60","Type":"ContainerDied","Data":"59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69"} Mar 13 20:48:29 crc kubenswrapper[4790]: I0313 20:48:29.385176 4790 generic.go:334] "Generic (PLEG): container finished" podID="7dd8df37-b60e-4ef1-9b53-6a59ba59e538" containerID="ffca9dd21fbe9bb0e162063d797b781df50325594db940d15d5f923d328ac878" exitCode=0 Mar 13 20:48:29 crc kubenswrapper[4790]: I0313 20:48:29.385358 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7dd8df37-b60e-4ef1-9b53-6a59ba59e538","Type":"ContainerDied","Data":"ffca9dd21fbe9bb0e162063d797b781df50325594db940d15d5f923d328ac878"} Mar 13 20:48:29 crc kubenswrapper[4790]: I0313 20:48:29.673268 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6792eda6-a284-42ab-a650-f21b012f7f44" path="/var/lib/kubelet/pods/6792eda6-a284-42ab-a650-f21b012f7f44/volumes" Mar 13 20:48:29 crc kubenswrapper[4790]: I0313 20:48:29.710044 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:48:29 crc kubenswrapper[4790]: I0313 20:48:29.847064 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.058034 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-854ddc4bd-b4ws7"] Mar 13 20:48:30 crc kubenswrapper[4790]: E0313 20:48:30.058614 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebc86632-179c-403a-bbdd-d496a21c018c" 
containerName="neutron-api" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.058638 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebc86632-179c-403a-bbdd-d496a21c018c" containerName="neutron-api" Mar 13 20:48:30 crc kubenswrapper[4790]: E0313 20:48:30.058660 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6792eda6-a284-42ab-a650-f21b012f7f44" containerName="init" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.058668 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6792eda6-a284-42ab-a650-f21b012f7f44" containerName="init" Mar 13 20:48:30 crc kubenswrapper[4790]: E0313 20:48:30.058678 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6792eda6-a284-42ab-a650-f21b012f7f44" containerName="dnsmasq-dns" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.058686 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6792eda6-a284-42ab-a650-f21b012f7f44" containerName="dnsmasq-dns" Mar 13 20:48:30 crc kubenswrapper[4790]: E0313 20:48:30.058706 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebc86632-179c-403a-bbdd-d496a21c018c" containerName="neutron-httpd" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.058715 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebc86632-179c-403a-bbdd-d496a21c018c" containerName="neutron-httpd" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.058979 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6792eda6-a284-42ab-a650-f21b012f7f44" containerName="dnsmasq-dns" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.058998 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebc86632-179c-403a-bbdd-d496a21c018c" containerName="neutron-api" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.059008 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebc86632-179c-403a-bbdd-d496a21c018c" containerName="neutron-httpd" Mar 13 20:48:30 crc 
kubenswrapper[4790]: I0313 20:48:30.060163 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.069781 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-854ddc4bd-b4ws7"] Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.223752 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b14c1738-5e9e-4810-b926-5b05af9ec22d-logs\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.223836 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-public-tls-certs\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.223859 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-scripts\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.223875 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-combined-ca-bundle\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.224072 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zdh7\" (UniqueName: \"kubernetes.io/projected/b14c1738-5e9e-4810-b926-5b05af9ec22d-kube-api-access-6zdh7\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.224188 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-config-data\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.224424 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-internal-tls-certs\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.325624 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-internal-tls-certs\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.325687 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b14c1738-5e9e-4810-b926-5b05af9ec22d-logs\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.325746 4790 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-public-tls-certs\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.325771 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-scripts\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.325792 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-combined-ca-bundle\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.325853 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zdh7\" (UniqueName: \"kubernetes.io/projected/b14c1738-5e9e-4810-b926-5b05af9ec22d-kube-api-access-6zdh7\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.325895 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-config-data\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.326295 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b14c1738-5e9e-4810-b926-5b05af9ec22d-logs\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.332050 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-config-data\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.332052 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-internal-tls-certs\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.335631 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-scripts\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.339804 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-public-tls-certs\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.345691 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-combined-ca-bundle\") pod \"placement-854ddc4bd-b4ws7\" (UID: 
\"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.349109 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zdh7\" (UniqueName: \"kubernetes.io/projected/b14c1738-5e9e-4810-b926-5b05af9ec22d-kube-api-access-6zdh7\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.382425 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.429772 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-77655f674d-4r7h4" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 13 20:48:30 crc kubenswrapper[4790]: W0313 20:48:30.863256 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb14c1738_5e9e_4810_b926_5b05af9ec22d.slice/crio-c12e16de22ade5386b8ff9b548fc3eb5d82ed370ff00bc188e6bdb058c6316c8 WatchSource:0}: Error finding container c12e16de22ade5386b8ff9b548fc3eb5d82ed370ff00bc188e6bdb058c6316c8: Status 404 returned error can't find the container with id c12e16de22ade5386b8ff9b548fc3eb5d82ed370ff00bc188e6bdb058c6316c8 Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.863584 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-854ddc4bd-b4ws7"] Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.276664 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c748666b-tvhxb" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api-log" 
probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:46044->10.217.0.165:9311: read: connection reset by peer" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.277166 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c748666b-tvhxb" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": dial tcp 10.217.0.165:9311: connect: connection refused" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.277218 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c748666b-tvhxb" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:46048->10.217.0.165:9311: read: connection reset by peer" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.277523 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c748666b-tvhxb" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": dial tcp 10.217.0.165:9311: connect: connection refused" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.422865 4790 generic.go:334] "Generic (PLEG): container finished" podID="7dd8df37-b60e-4ef1-9b53-6a59ba59e538" containerID="b27907adc19d02cf9eb527f95f4e0f1927d997cba55d2e2d8cff7b9730da30e9" exitCode=0 Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.423117 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7dd8df37-b60e-4ef1-9b53-6a59ba59e538","Type":"ContainerDied","Data":"b27907adc19d02cf9eb527f95f4e0f1927d997cba55d2e2d8cff7b9730da30e9"} Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.430540 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-854ddc4bd-b4ws7" event={"ID":"b14c1738-5e9e-4810-b926-5b05af9ec22d","Type":"ContainerStarted","Data":"0bb4c167ba356dbccb82728022adb0fe6fcad84e2e8b4fa3e49ed62833eda460"} Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.430602 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-854ddc4bd-b4ws7" event={"ID":"b14c1738-5e9e-4810-b926-5b05af9ec22d","Type":"ContainerStarted","Data":"f97b4ab3b27494d831c83dee4c542c033948187c65aed470e1f59fca6513f8ec"} Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.430616 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-854ddc4bd-b4ws7" event={"ID":"b14c1738-5e9e-4810-b926-5b05af9ec22d","Type":"ContainerStarted","Data":"c12e16de22ade5386b8ff9b548fc3eb5d82ed370ff00bc188e6bdb058c6316c8"} Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.431081 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.431509 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.433987 4790 generic.go:334] "Generic (PLEG): container finished" podID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerID="17f7572a60defea9a0c54762cab48549434213f3829ef554fc1c0c0339839360" exitCode=0 Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.434042 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c748666b-tvhxb" event={"ID":"84cf2aee-27d9-4022-8c67-55840b2faedd","Type":"ContainerDied","Data":"17f7572a60defea9a0c54762cab48549434213f3829ef554fc1c0c0339839360"} Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.456901 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-854ddc4bd-b4ws7" podStartSLOduration=1.456876493 podStartE2EDuration="1.456876493s" 
podCreationTimestamp="2026-03-13 20:48:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:48:31.451505646 +0000 UTC m=+1242.472621547" watchObservedRunningTime="2026-03-13 20:48:31.456876493 +0000 UTC m=+1242.477992394" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.708962 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.825742 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.854933 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data-custom\") pod \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.855277 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-scripts\") pod \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.855314 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data\") pod \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.855363 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srclt\" (UniqueName: 
\"kubernetes.io/projected/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-kube-api-access-srclt\") pod \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.855414 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-etc-machine-id\") pod \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.855465 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-combined-ca-bundle\") pod \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.856099 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7dd8df37-b60e-4ef1-9b53-6a59ba59e538" (UID: "7dd8df37-b60e-4ef1-9b53-6a59ba59e538"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.862206 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7dd8df37-b60e-4ef1-9b53-6a59ba59e538" (UID: "7dd8df37-b60e-4ef1-9b53-6a59ba59e538"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.863250 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-kube-api-access-srclt" (OuterVolumeSpecName: "kube-api-access-srclt") pod "7dd8df37-b60e-4ef1-9b53-6a59ba59e538" (UID: "7dd8df37-b60e-4ef1-9b53-6a59ba59e538"). InnerVolumeSpecName "kube-api-access-srclt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.875407 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-scripts" (OuterVolumeSpecName: "scripts") pod "7dd8df37-b60e-4ef1-9b53-6a59ba59e538" (UID: "7dd8df37-b60e-4ef1-9b53-6a59ba59e538"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.928150 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7dd8df37-b60e-4ef1-9b53-6a59ba59e538" (UID: "7dd8df37-b60e-4ef1-9b53-6a59ba59e538"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.957054 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data-custom\") pod \"84cf2aee-27d9-4022-8c67-55840b2faedd\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.957114 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data\") pod \"84cf2aee-27d9-4022-8c67-55840b2faedd\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.957279 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2mvl\" (UniqueName: \"kubernetes.io/projected/84cf2aee-27d9-4022-8c67-55840b2faedd-kube-api-access-d2mvl\") pod \"84cf2aee-27d9-4022-8c67-55840b2faedd\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.957315 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84cf2aee-27d9-4022-8c67-55840b2faedd-logs\") pod \"84cf2aee-27d9-4022-8c67-55840b2faedd\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.957433 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-combined-ca-bundle\") pod \"84cf2aee-27d9-4022-8c67-55840b2faedd\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.957770 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/84cf2aee-27d9-4022-8c67-55840b2faedd-logs" (OuterVolumeSpecName: "logs") pod "84cf2aee-27d9-4022-8c67-55840b2faedd" (UID: "84cf2aee-27d9-4022-8c67-55840b2faedd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.957799 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.957849 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srclt\" (UniqueName: \"kubernetes.io/projected/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-kube-api-access-srclt\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.957861 4790 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.957871 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.957880 4790 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.961575 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84cf2aee-27d9-4022-8c67-55840b2faedd-kube-api-access-d2mvl" (OuterVolumeSpecName: "kube-api-access-d2mvl") pod "84cf2aee-27d9-4022-8c67-55840b2faedd" (UID: "84cf2aee-27d9-4022-8c67-55840b2faedd"). 
InnerVolumeSpecName "kube-api-access-d2mvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.962360 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "84cf2aee-27d9-4022-8c67-55840b2faedd" (UID: "84cf2aee-27d9-4022-8c67-55840b2faedd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.975517 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data" (OuterVolumeSpecName: "config-data") pod "7dd8df37-b60e-4ef1-9b53-6a59ba59e538" (UID: "7dd8df37-b60e-4ef1-9b53-6a59ba59e538"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.985036 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84cf2aee-27d9-4022-8c67-55840b2faedd" (UID: "84cf2aee-27d9-4022-8c67-55840b2faedd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.005481 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data" (OuterVolumeSpecName: "config-data") pod "84cf2aee-27d9-4022-8c67-55840b2faedd" (UID: "84cf2aee-27d9-4022-8c67-55840b2faedd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.061828 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2mvl\" (UniqueName: \"kubernetes.io/projected/84cf2aee-27d9-4022-8c67-55840b2faedd-kube-api-access-d2mvl\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.061871 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.061885 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84cf2aee-27d9-4022-8c67-55840b2faedd-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.061896 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.061911 4790 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.061921 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.443910 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c748666b-tvhxb" event={"ID":"84cf2aee-27d9-4022-8c67-55840b2faedd","Type":"ContainerDied","Data":"d42a77ccb8df8e4551bc2d02ca8dcf98b96ca45a3c404d603bdd2962aa71b56a"} Mar 13 20:48:32 crc kubenswrapper[4790]: 
I0313 20:48:32.443965 4790 scope.go:117] "RemoveContainer" containerID="17f7572a60defea9a0c54762cab48549434213f3829ef554fc1c0c0339839360" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.444071 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.451296 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.451480 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7dd8df37-b60e-4ef1-9b53-6a59ba59e538","Type":"ContainerDied","Data":"356f1e6d14418c7b4d47c93a6ddd977f1792b488cf4e7ec1247f3eaca698c030"} Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.471483 4790 scope.go:117] "RemoveContainer" containerID="8d83c8808f4540d59bea2732861e3d03b6d099d9067691314cc326bd7240a581" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.485851 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5c748666b-tvhxb"] Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.496516 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5c748666b-tvhxb"] Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.503276 4790 scope.go:117] "RemoveContainer" containerID="ffca9dd21fbe9bb0e162063d797b781df50325594db940d15d5f923d328ac878" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.504891 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.514049 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.521625 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 20:48:32 crc kubenswrapper[4790]: 
E0313 20:48:32.521980 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd8df37-b60e-4ef1-9b53-6a59ba59e538" containerName="cinder-scheduler" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.521997 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd8df37-b60e-4ef1-9b53-6a59ba59e538" containerName="cinder-scheduler" Mar 13 20:48:32 crc kubenswrapper[4790]: E0313 20:48:32.522009 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.522015 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api" Mar 13 20:48:32 crc kubenswrapper[4790]: E0313 20:48:32.522040 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd8df37-b60e-4ef1-9b53-6a59ba59e538" containerName="probe" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.522046 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd8df37-b60e-4ef1-9b53-6a59ba59e538" containerName="probe" Mar 13 20:48:32 crc kubenswrapper[4790]: E0313 20:48:32.522069 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api-log" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.522076 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api-log" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.522235 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.522248 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd8df37-b60e-4ef1-9b53-6a59ba59e538" containerName="probe" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.522261 4790 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd8df37-b60e-4ef1-9b53-6a59ba59e538" containerName="cinder-scheduler" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.522276 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api-log" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.523144 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.526134 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.533720 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.543528 4790 scope.go:117] "RemoveContainer" containerID="b27907adc19d02cf9eb527f95f4e0f1927d997cba55d2e2d8cff7b9730da30e9" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.674462 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-scripts\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.674896 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-config-data\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.674919 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.674991 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw9jj\" (UniqueName: \"kubernetes.io/projected/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-kube-api-access-fw9jj\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.675025 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.675047 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.776530 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw9jj\" (UniqueName: \"kubernetes.io/projected/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-kube-api-access-fw9jj\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.776610 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.776630 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.776693 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-scripts\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.776752 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-config-data\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.776771 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.776866 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " 
pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.780686 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.780810 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-scripts\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.781213 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-config-data\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.793256 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw9jj\" (UniqueName: \"kubernetes.io/projected/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-kube-api-access-fw9jj\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.803973 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.857582 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 20:48:33 crc kubenswrapper[4790]: I0313 20:48:33.317156 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 20:48:33 crc kubenswrapper[4790]: W0313 20:48:33.319967 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ccdc6f2_f911_48c1_b8a8_dc6f2054fed5.slice/crio-23fb9682fba1ed858ea3d7cc017852286d7cadc8b134423c5b6577108c2a7046 WatchSource:0}: Error finding container 23fb9682fba1ed858ea3d7cc017852286d7cadc8b134423c5b6577108c2a7046: Status 404 returned error can't find the container with id 23fb9682fba1ed858ea3d7cc017852286d7cadc8b134423c5b6577108c2a7046 Mar 13 20:48:33 crc kubenswrapper[4790]: I0313 20:48:33.479876 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5","Type":"ContainerStarted","Data":"23fb9682fba1ed858ea3d7cc017852286d7cadc8b134423c5b6577108c2a7046"} Mar 13 20:48:33 crc kubenswrapper[4790]: I0313 20:48:33.671429 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dd8df37-b60e-4ef1-9b53-6a59ba59e538" path="/var/lib/kubelet/pods/7dd8df37-b60e-4ef1-9b53-6a59ba59e538/volumes" Mar 13 20:48:33 crc kubenswrapper[4790]: I0313 20:48:33.672618 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" path="/var/lib/kubelet/pods/84cf2aee-27d9-4022-8c67-55840b2faedd/volumes" Mar 13 20:48:34 crc kubenswrapper[4790]: I0313 20:48:34.489180 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5","Type":"ContainerStarted","Data":"5c2d6b68291f7ff6741dc13b76e0434d10b189856a64a8b04fbdb7c15279b680"} Mar 13 20:48:34 crc kubenswrapper[4790]: I0313 20:48:34.489706 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-scheduler-0" event={"ID":"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5","Type":"ContainerStarted","Data":"5319bb6a60d6fcc768d905176c5ccd9cc1fe41c0f21dd6d023c85919e06fcc93"} Mar 13 20:48:34 crc kubenswrapper[4790]: I0313 20:48:34.511347 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.511331181 podStartE2EDuration="2.511331181s" podCreationTimestamp="2026-03-13 20:48:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:48:34.508233107 +0000 UTC m=+1245.529348998" watchObservedRunningTime="2026-03-13 20:48:34.511331181 +0000 UTC m=+1245.532447072" Mar 13 20:48:35 crc kubenswrapper[4790]: I0313 20:48:35.454925 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 13 20:48:36 crc kubenswrapper[4790]: I0313 20:48:36.447765 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:37 crc kubenswrapper[4790]: I0313 20:48:37.858361 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.428785 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-77655f674d-4r7h4" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.767167 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.768516 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.770713 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.772717 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.782231 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-d69w2" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.784497 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.858485 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7f0237c2-5c72-4776-9226-67244abca8dd-openstack-config-secret\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.858739 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f0237c2-5c72-4776-9226-67244abca8dd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.858785 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfjl6\" (UniqueName: \"kubernetes.io/projected/7f0237c2-5c72-4776-9226-67244abca8dd-kube-api-access-pfjl6\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.859095 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7f0237c2-5c72-4776-9226-67244abca8dd-openstack-config\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.961492 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7f0237c2-5c72-4776-9226-67244abca8dd-openstack-config\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.961602 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7f0237c2-5c72-4776-9226-67244abca8dd-openstack-config-secret\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.961716 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f0237c2-5c72-4776-9226-67244abca8dd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.961737 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfjl6\" (UniqueName: \"kubernetes.io/projected/7f0237c2-5c72-4776-9226-67244abca8dd-kube-api-access-pfjl6\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.962352 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/7f0237c2-5c72-4776-9226-67244abca8dd-openstack-config\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.970263 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7f0237c2-5c72-4776-9226-67244abca8dd-openstack-config-secret\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.970453 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f0237c2-5c72-4776-9226-67244abca8dd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.980789 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfjl6\" (UniqueName: \"kubernetes.io/projected/7f0237c2-5c72-4776-9226-67244abca8dd-kube-api-access-pfjl6\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:41 crc kubenswrapper[4790]: I0313 20:48:41.145003 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 20:48:41 crc kubenswrapper[4790]: I0313 20:48:41.640795 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 20:48:42 crc kubenswrapper[4790]: I0313 20:48:42.566165 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7f0237c2-5c72-4776-9226-67244abca8dd","Type":"ContainerStarted","Data":"abc967605bc85a141593aed19b10621514cf7f5701c183acf4e6e90950ad49e9"} Mar 13 20:48:43 crc kubenswrapper[4790]: I0313 20:48:43.180955 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 13 20:48:44 crc kubenswrapper[4790]: I0313 20:48:44.929964 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-798495789f-5fvw5"] Mar 13 20:48:44 crc kubenswrapper[4790]: I0313 20:48:44.939417 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:44 crc kubenswrapper[4790]: I0313 20:48:44.942126 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 13 20:48:44 crc kubenswrapper[4790]: I0313 20:48:44.942247 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 13 20:48:44 crc kubenswrapper[4790]: I0313 20:48:44.942416 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 13 20:48:44 crc kubenswrapper[4790]: I0313 20:48:44.948603 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-798495789f-5fvw5"] Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.039202 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d498924-f84f-48aa-b971-b58cbea48295-run-httpd\") pod \"swift-proxy-798495789f-5fvw5\" 
(UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.039336 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-config-data\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.039395 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-combined-ca-bundle\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.039426 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5gjn\" (UniqueName: \"kubernetes.io/projected/7d498924-f84f-48aa-b971-b58cbea48295-kube-api-access-z5gjn\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.039535 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d498924-f84f-48aa-b971-b58cbea48295-log-httpd\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.039580 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-public-tls-certs\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.039620 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7d498924-f84f-48aa-b971-b58cbea48295-etc-swift\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.039662 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-internal-tls-certs\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.141038 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-config-data\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.141116 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-combined-ca-bundle\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.141139 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5gjn\" (UniqueName: 
\"kubernetes.io/projected/7d498924-f84f-48aa-b971-b58cbea48295-kube-api-access-z5gjn\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.141170 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d498924-f84f-48aa-b971-b58cbea48295-log-httpd\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.141208 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-public-tls-certs\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.141250 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7d498924-f84f-48aa-b971-b58cbea48295-etc-swift\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.141276 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-internal-tls-certs\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.141435 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7d498924-f84f-48aa-b971-b58cbea48295-run-httpd\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.141623 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d498924-f84f-48aa-b971-b58cbea48295-log-httpd\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.142015 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d498924-f84f-48aa-b971-b58cbea48295-run-httpd\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.149565 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-public-tls-certs\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.149565 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-internal-tls-certs\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.152232 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-combined-ca-bundle\") pod 
\"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.153031 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7d498924-f84f-48aa-b971-b58cbea48295-etc-swift\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.158272 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-config-data\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.164951 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5gjn\" (UniqueName: \"kubernetes.io/projected/7d498924-f84f-48aa-b971-b58cbea48295-kube-api-access-z5gjn\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.279010 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.949787 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-798495789f-5fvw5"] Mar 13 20:48:45 crc kubenswrapper[4790]: W0313 20:48:45.997323 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d498924_f84f_48aa_b971_b58cbea48295.slice/crio-e864879895a5f488df2e8a0cdbee31bf6e25161fd4a9be68385c40769776f87a WatchSource:0}: Error finding container e864879895a5f488df2e8a0cdbee31bf6e25161fd4a9be68385c40769776f87a: Status 404 returned error can't find the container with id e864879895a5f488df2e8a0cdbee31bf6e25161fd4a9be68385c40769776f87a Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.535541 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.592296 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-combined-ca-bundle\") pod \"1abdfade-817b-4659-b8be-48bb516fb866\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.592373 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-log-httpd\") pod \"1abdfade-817b-4659-b8be-48bb516fb866\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.592422 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-config-data\") pod \"1abdfade-817b-4659-b8be-48bb516fb866\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " 
Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.592488 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-scripts\") pod \"1abdfade-817b-4659-b8be-48bb516fb866\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.592543 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdq79\" (UniqueName: \"kubernetes.io/projected/1abdfade-817b-4659-b8be-48bb516fb866-kube-api-access-bdq79\") pod \"1abdfade-817b-4659-b8be-48bb516fb866\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.592660 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-sg-core-conf-yaml\") pod \"1abdfade-817b-4659-b8be-48bb516fb866\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.592705 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-run-httpd\") pod \"1abdfade-817b-4659-b8be-48bb516fb866\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.594478 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1abdfade-817b-4659-b8be-48bb516fb866" (UID: "1abdfade-817b-4659-b8be-48bb516fb866"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.595323 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1abdfade-817b-4659-b8be-48bb516fb866" (UID: "1abdfade-817b-4659-b8be-48bb516fb866"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.603186 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1abdfade-817b-4659-b8be-48bb516fb866-kube-api-access-bdq79" (OuterVolumeSpecName: "kube-api-access-bdq79") pod "1abdfade-817b-4659-b8be-48bb516fb866" (UID: "1abdfade-817b-4659-b8be-48bb516fb866"). InnerVolumeSpecName "kube-api-access-bdq79". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.603494 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-scripts" (OuterVolumeSpecName: "scripts") pod "1abdfade-817b-4659-b8be-48bb516fb866" (UID: "1abdfade-817b-4659-b8be-48bb516fb866"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.625838 4790 generic.go:334] "Generic (PLEG): container finished" podID="1abdfade-817b-4659-b8be-48bb516fb866" containerID="8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4" exitCode=137 Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.626141 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.630980 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1abdfade-817b-4659-b8be-48bb516fb866","Type":"ContainerDied","Data":"8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4"} Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.631055 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1abdfade-817b-4659-b8be-48bb516fb866","Type":"ContainerDied","Data":"3240e14626b1a27ca0670703b5e37bc443567929afbc6effd03a1f681e6eeda6"} Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.631079 4790 scope.go:117] "RemoveContainer" containerID="8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.641135 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-798495789f-5fvw5" event={"ID":"7d498924-f84f-48aa-b971-b58cbea48295","Type":"ContainerStarted","Data":"b2473dc84a21b361e755e4dc3b18c75ade1a917e3696e1b6cf9c8ebc083cbfcf"} Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.641462 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-798495789f-5fvw5" event={"ID":"7d498924-f84f-48aa-b971-b58cbea48295","Type":"ContainerStarted","Data":"e864879895a5f488df2e8a0cdbee31bf6e25161fd4a9be68385c40769776f87a"} Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.679479 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1abdfade-817b-4659-b8be-48bb516fb866" (UID: "1abdfade-817b-4659-b8be-48bb516fb866"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.680366 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1abdfade-817b-4659-b8be-48bb516fb866" (UID: "1abdfade-817b-4659-b8be-48bb516fb866"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.694886 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.695090 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.695104 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.695117 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.695127 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdq79\" (UniqueName: \"kubernetes.io/projected/1abdfade-817b-4659-b8be-48bb516fb866-kube-api-access-bdq79\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.695138 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.696519 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-config-data" (OuterVolumeSpecName: "config-data") pod "1abdfade-817b-4659-b8be-48bb516fb866" (UID: "1abdfade-817b-4659-b8be-48bb516fb866"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.796594 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.988193 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.999745 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.029410 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:48:47 crc kubenswrapper[4790]: E0313 20:48:47.030347 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="sg-core" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.030393 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="sg-core" Mar 13 20:48:47 crc kubenswrapper[4790]: E0313 20:48:47.030438 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="ceilometer-notification-agent" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.030448 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="ceilometer-notification-agent" Mar 13 20:48:47 crc kubenswrapper[4790]: E0313 20:48:47.030522 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="proxy-httpd" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.030534 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="proxy-httpd" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.033807 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="proxy-httpd" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.033873 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="ceilometer-notification-agent" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.033903 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="sg-core" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.035948 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.040791 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.042010 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.046846 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.104173 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-log-httpd\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.104330 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.104357 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-run-httpd\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.104443 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xntq9\" (UniqueName: \"kubernetes.io/projected/dc306890-4355-4f40-abc0-11753b34d120-kube-api-access-xntq9\") pod \"ceilometer-0\" (UID: 
\"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.104542 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-scripts\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.104591 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-config-data\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.104645 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.206836 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-log-httpd\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.206929 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.206961 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-run-httpd\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.207023 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xntq9\" (UniqueName: \"kubernetes.io/projected/dc306890-4355-4f40-abc0-11753b34d120-kube-api-access-xntq9\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.207172 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-scripts\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.207199 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-config-data\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.207236 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.207494 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-log-httpd\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " 
pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.207524 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-run-httpd\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.212033 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.212069 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-scripts\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.215631 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-config-data\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.224743 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.225091 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xntq9\" (UniqueName: 
\"kubernetes.io/projected/dc306890-4355-4f40-abc0-11753b34d120-kube-api-access-xntq9\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.361790 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.689527 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1abdfade-817b-4659-b8be-48bb516fb866" path="/var/lib/kubelet/pods/1abdfade-817b-4659-b8be-48bb516fb866/volumes" Mar 13 20:48:49 crc kubenswrapper[4790]: I0313 20:48:49.952149 4790 scope.go:117] "RemoveContainer" containerID="31ce3becbe5f9fc73efb71d7c9c70a67bb2549c4e27e76481e3678501a4317cf" Mar 13 20:48:50 crc kubenswrapper[4790]: I0313 20:48:50.429069 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-77655f674d-4r7h4" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 13 20:48:50 crc kubenswrapper[4790]: I0313 20:48:50.429447 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:48:51 crc kubenswrapper[4790]: I0313 20:48:51.872133 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:51 crc kubenswrapper[4790]: I0313 20:48:51.971850 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5fc7fb5bf6-ctr9l"] Mar 13 20:48:51 crc kubenswrapper[4790]: I0313 20:48:51.972392 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5fc7fb5bf6-ctr9l" podUID="96f53d5c-8b27-4810-a760-f7c9a4ee567b" containerName="neutron-api" 
containerID="cri-o://449a35d79f426767909c30ff57f1a03c65663f3b50a2fecaf21aa36b537c5d09" gracePeriod=30 Mar 13 20:48:51 crc kubenswrapper[4790]: I0313 20:48:51.972804 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5fc7fb5bf6-ctr9l" podUID="96f53d5c-8b27-4810-a760-f7c9a4ee567b" containerName="neutron-httpd" containerID="cri-o://8b2e29cd1d39fc375a2c87170b615afd8165699c2feb129d8fe6f2064e48bc4e" gracePeriod=30 Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.016693 4790 scope.go:117] "RemoveContainer" containerID="fe7297aab5981431006e363000146624b164562815f098000374d6b910719486" Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.127547 4790 scope.go:117] "RemoveContainer" containerID="0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c" Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.236845 4790 scope.go:117] "RemoveContainer" containerID="8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4" Mar 13 20:48:52 crc kubenswrapper[4790]: E0313 20:48:52.238471 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4\": container with ID starting with 8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4 not found: ID does not exist" containerID="8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4" Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.238535 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4"} err="failed to get container status \"8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4\": rpc error: code = NotFound desc = could not find container \"8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4\": container with ID starting with 
8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4 not found: ID does not exist" Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.238556 4790 scope.go:117] "RemoveContainer" containerID="fe7297aab5981431006e363000146624b164562815f098000374d6b910719486" Mar 13 20:48:52 crc kubenswrapper[4790]: E0313 20:48:52.239880 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe7297aab5981431006e363000146624b164562815f098000374d6b910719486\": container with ID starting with fe7297aab5981431006e363000146624b164562815f098000374d6b910719486 not found: ID does not exist" containerID="fe7297aab5981431006e363000146624b164562815f098000374d6b910719486" Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.239910 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe7297aab5981431006e363000146624b164562815f098000374d6b910719486"} err="failed to get container status \"fe7297aab5981431006e363000146624b164562815f098000374d6b910719486\": rpc error: code = NotFound desc = could not find container \"fe7297aab5981431006e363000146624b164562815f098000374d6b910719486\": container with ID starting with fe7297aab5981431006e363000146624b164562815f098000374d6b910719486 not found: ID does not exist" Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.239930 4790 scope.go:117] "RemoveContainer" containerID="0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c" Mar 13 20:48:52 crc kubenswrapper[4790]: E0313 20:48:52.240242 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c\": container with ID starting with 0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c not found: ID does not exist" containerID="0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c" Mar 13 20:48:52 crc 
kubenswrapper[4790]: I0313 20:48:52.240289 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c"} err="failed to get container status \"0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c\": rpc error: code = NotFound desc = could not find container \"0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c\": container with ID starting with 0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c not found: ID does not exist" Mar 13 20:48:52 crc kubenswrapper[4790]: W0313 20:48:52.637984 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc306890_4355_4f40_abc0_11753b34d120.slice/crio-024f2c04fd4b7dc120d8e7fd5885a7dd4f3c3552f5f6d2b723fe33619d522ce0 WatchSource:0}: Error finding container 024f2c04fd4b7dc120d8e7fd5885a7dd4f3c3552f5f6d2b723fe33619d522ce0: Status 404 returned error can't find the container with id 024f2c04fd4b7dc120d8e7fd5885a7dd4f3c3552f5f6d2b723fe33619d522ce0 Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.638616 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.772772 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc306890-4355-4f40-abc0-11753b34d120","Type":"ContainerStarted","Data":"024f2c04fd4b7dc120d8e7fd5885a7dd4f3c3552f5f6d2b723fe33619d522ce0"} Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.775184 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7f0237c2-5c72-4776-9226-67244abca8dd","Type":"ContainerStarted","Data":"ec8012539c5fb52c716a3829d23983c3bc81706c32675695f09bf5f4ba8e2727"} Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.778342 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-proxy-798495789f-5fvw5" event={"ID":"7d498924-f84f-48aa-b971-b58cbea48295","Type":"ContainerStarted","Data":"75d271d47d12592ed78b17cd53ef3de64fc116a56e2a8f41c8439584d19b2524"} Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.778530 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.781079 4790 generic.go:334] "Generic (PLEG): container finished" podID="96f53d5c-8b27-4810-a760-f7c9a4ee567b" containerID="8b2e29cd1d39fc375a2c87170b615afd8165699c2feb129d8fe6f2064e48bc4e" exitCode=0 Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.781145 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fc7fb5bf6-ctr9l" event={"ID":"96f53d5c-8b27-4810-a760-f7c9a4ee567b","Type":"ContainerDied","Data":"8b2e29cd1d39fc375a2c87170b615afd8165699c2feb129d8fe6f2064e48bc4e"} Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.798360 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.229869181 podStartE2EDuration="12.798338369s" podCreationTimestamp="2026-03-13 20:48:40 +0000 UTC" firstStartedPulling="2026-03-13 20:48:41.648028821 +0000 UTC m=+1252.669144713" lastFinishedPulling="2026-03-13 20:48:52.216498 +0000 UTC m=+1263.237613901" observedRunningTime="2026-03-13 20:48:52.789805176 +0000 UTC m=+1263.810921077" watchObservedRunningTime="2026-03-13 20:48:52.798338369 +0000 UTC m=+1263.819454260" Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.816734 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-798495789f-5fvw5" podStartSLOduration=8.816719591 podStartE2EDuration="8.816719591s" podCreationTimestamp="2026-03-13 20:48:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 
20:48:52.812215718 +0000 UTC m=+1263.833331619" watchObservedRunningTime="2026-03-13 20:48:52.816719591 +0000 UTC m=+1263.837835482" Mar 13 20:48:53 crc kubenswrapper[4790]: I0313 20:48:53.791350 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc306890-4355-4f40-abc0-11753b34d120","Type":"ContainerStarted","Data":"e77c2f06981ff16ce8a83ea4cf86ff45903e943a0fce3443c6ecd2493b205d22"} Mar 13 20:48:53 crc kubenswrapper[4790]: I0313 20:48:53.791612 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.142603 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.248176 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.248489 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" containerName="glance-log" containerID="cri-o://4349a4319d7d7f3a7af4e8d8122ef2003198a82dbec9b58b843ef6769bc7f33d" gracePeriod=30 Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.248553 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" containerName="glance-httpd" containerID="cri-o://96ac0a7c5978eeb8c0f3a4fc52a8593d87b076ec513b819bbd3b74106a8ca70e" gracePeriod=30 Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.819749 4790 generic.go:334] "Generic (PLEG): container finished" podID="96f53d5c-8b27-4810-a760-f7c9a4ee567b" containerID="449a35d79f426767909c30ff57f1a03c65663f3b50a2fecaf21aa36b537c5d09" exitCode=0 Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.819933 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fc7fb5bf6-ctr9l" event={"ID":"96f53d5c-8b27-4810-a760-f7c9a4ee567b","Type":"ContainerDied","Data":"449a35d79f426767909c30ff57f1a03c65663f3b50a2fecaf21aa36b537c5d09"} Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.825695 4790 generic.go:334] "Generic (PLEG): container finished" podID="6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" containerID="4349a4319d7d7f3a7af4e8d8122ef2003198a82dbec9b58b843ef6769bc7f33d" exitCode=143 Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.825770 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2","Type":"ContainerDied","Data":"4349a4319d7d7f3a7af4e8d8122ef2003198a82dbec9b58b843ef6769bc7f33d"} Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.831207 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc306890-4355-4f40-abc0-11753b34d120","Type":"ContainerStarted","Data":"427f4e121de836625292dc58d9f628e241b940ecfee11bfb04fb92802c2bd9a9"} Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.862834 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.878242 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-config\") pod \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.878415 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-httpd-config\") pod \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.878490 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcl26\" (UniqueName: \"kubernetes.io/projected/96f53d5c-8b27-4810-a760-f7c9a4ee567b-kube-api-access-xcl26\") pod \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.878564 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-combined-ca-bundle\") pod \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.878606 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-ovndb-tls-certs\") pod \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.889203 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/96f53d5c-8b27-4810-a760-f7c9a4ee567b-kube-api-access-xcl26" (OuterVolumeSpecName: "kube-api-access-xcl26") pod "96f53d5c-8b27-4810-a760-f7c9a4ee567b" (UID: "96f53d5c-8b27-4810-a760-f7c9a4ee567b"). InnerVolumeSpecName "kube-api-access-xcl26". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.891264 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "96f53d5c-8b27-4810-a760-f7c9a4ee567b" (UID: "96f53d5c-8b27-4810-a760-f7c9a4ee567b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.938862 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-config" (OuterVolumeSpecName: "config") pod "96f53d5c-8b27-4810-a760-f7c9a4ee567b" (UID: "96f53d5c-8b27-4810-a760-f7c9a4ee567b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.955633 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96f53d5c-8b27-4810-a760-f7c9a4ee567b" (UID: "96f53d5c-8b27-4810-a760-f7c9a4ee567b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.991408 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "96f53d5c-8b27-4810-a760-f7c9a4ee567b" (UID: "96f53d5c-8b27-4810-a760-f7c9a4ee567b"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.992644 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-ovndb-tls-certs\") pod \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " Mar 13 20:48:54 crc kubenswrapper[4790]: W0313 20:48:54.992812 4790 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/96f53d5c-8b27-4810-a760-f7c9a4ee567b/volumes/kubernetes.io~secret/ovndb-tls-certs Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.992844 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "96f53d5c-8b27-4810-a760-f7c9a4ee567b" (UID: "96f53d5c-8b27-4810-a760-f7c9a4ee567b"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.993461 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.993493 4790 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.993505 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.993520 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.993531 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcl26\" (UniqueName: \"kubernetes.io/projected/96f53d5c-8b27-4810-a760-f7c9a4ee567b-kube-api-access-xcl26\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.610600 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.776642 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.804901 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-tls-certs\") pod \"596ad32f-9087-4dbe-a495-8bf03200cd60\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.804976 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-scripts\") pod \"596ad32f-9087-4dbe-a495-8bf03200cd60\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.805011 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596ad32f-9087-4dbe-a495-8bf03200cd60-logs\") pod \"596ad32f-9087-4dbe-a495-8bf03200cd60\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.805182 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-config-data\") pod \"596ad32f-9087-4dbe-a495-8bf03200cd60\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.805216 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6t9k\" (UniqueName: \"kubernetes.io/projected/596ad32f-9087-4dbe-a495-8bf03200cd60-kube-api-access-l6t9k\") pod \"596ad32f-9087-4dbe-a495-8bf03200cd60\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.805264 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-secret-key\") pod \"596ad32f-9087-4dbe-a495-8bf03200cd60\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.805317 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-combined-ca-bundle\") pod \"596ad32f-9087-4dbe-a495-8bf03200cd60\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.807317 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/596ad32f-9087-4dbe-a495-8bf03200cd60-logs" (OuterVolumeSpecName: "logs") pod "596ad32f-9087-4dbe-a495-8bf03200cd60" (UID: "596ad32f-9087-4dbe-a495-8bf03200cd60"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.811522 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/596ad32f-9087-4dbe-a495-8bf03200cd60-kube-api-access-l6t9k" (OuterVolumeSpecName: "kube-api-access-l6t9k") pod "596ad32f-9087-4dbe-a495-8bf03200cd60" (UID: "596ad32f-9087-4dbe-a495-8bf03200cd60"). InnerVolumeSpecName "kube-api-access-l6t9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.814855 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "596ad32f-9087-4dbe-a495-8bf03200cd60" (UID: "596ad32f-9087-4dbe-a495-8bf03200cd60"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.835095 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-config-data" (OuterVolumeSpecName: "config-data") pod "596ad32f-9087-4dbe-a495-8bf03200cd60" (UID: "596ad32f-9087-4dbe-a495-8bf03200cd60"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.846300 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-scripts" (OuterVolumeSpecName: "scripts") pod "596ad32f-9087-4dbe-a495-8bf03200cd60" (UID: "596ad32f-9087-4dbe-a495-8bf03200cd60"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.846664 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.846842 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fc7fb5bf6-ctr9l" event={"ID":"96f53d5c-8b27-4810-a760-f7c9a4ee567b","Type":"ContainerDied","Data":"244df6a639eefd00639382cbb0a24020174298bb705bcc4beb3a1a60874bb9a0"} Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.847102 4790 scope.go:117] "RemoveContainer" containerID="8b2e29cd1d39fc375a2c87170b615afd8165699c2feb129d8fe6f2064e48bc4e" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.851587 4790 generic.go:334] "Generic (PLEG): container finished" podID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerID="75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483" exitCode=137 Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.851672 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77655f674d-4r7h4" event={"ID":"596ad32f-9087-4dbe-a495-8bf03200cd60","Type":"ContainerDied","Data":"75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483"} Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.851697 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77655f674d-4r7h4" event={"ID":"596ad32f-9087-4dbe-a495-8bf03200cd60","Type":"ContainerDied","Data":"32071f4748bdbdbbb2169f1b2a9fc194d9a40accb2c6784c59874d08e8b9f3b6"} Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.851763 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.857512 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "596ad32f-9087-4dbe-a495-8bf03200cd60" (UID: "596ad32f-9087-4dbe-a495-8bf03200cd60"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.867145 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc306890-4355-4f40-abc0-11753b34d120","Type":"ContainerStarted","Data":"c12845e4c31624900dc62dbd98c8791cea0c6f646b09e9fa4c0931ec955bfc38"} Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.886891 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5fc7fb5bf6-ctr9l"] Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.894669 4790 scope.go:117] "RemoveContainer" containerID="449a35d79f426767909c30ff57f1a03c65663f3b50a2fecaf21aa36b537c5d09" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.898495 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "596ad32f-9087-4dbe-a495-8bf03200cd60" (UID: "596ad32f-9087-4dbe-a495-8bf03200cd60"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.904877 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5fc7fb5bf6-ctr9l"] Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.907594 4790 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.907627 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.907637 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596ad32f-9087-4dbe-a495-8bf03200cd60-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.907646 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.907656 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6t9k\" (UniqueName: \"kubernetes.io/projected/596ad32f-9087-4dbe-a495-8bf03200cd60-kube-api-access-l6t9k\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.907666 4790 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.907675 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.912184 4790 scope.go:117] "RemoveContainer" containerID="59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69" Mar 13 20:48:56 crc kubenswrapper[4790]: I0313 20:48:56.075114 4790 scope.go:117] "RemoveContainer" containerID="75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483" Mar 13 20:48:56 crc kubenswrapper[4790]: I0313 20:48:56.097087 4790 scope.go:117] "RemoveContainer" containerID="59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69" Mar 13 20:48:56 crc kubenswrapper[4790]: E0313 20:48:56.097686 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69\": container with ID starting with 59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69 not found: ID does not exist" containerID="59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69" Mar 13 20:48:56 crc kubenswrapper[4790]: I0313 20:48:56.097720 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69"} err="failed to get container status \"59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69\": rpc error: code = NotFound desc = could not find container \"59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69\": container with ID starting with 59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69 not found: ID does not exist" Mar 13 20:48:56 crc kubenswrapper[4790]: I0313 20:48:56.097741 4790 scope.go:117] "RemoveContainer" containerID="75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483" Mar 13 20:48:56 crc kubenswrapper[4790]: E0313 20:48:56.098164 4790 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483\": container with ID starting with 75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483 not found: ID does not exist" containerID="75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483" Mar 13 20:48:56 crc kubenswrapper[4790]: I0313 20:48:56.098238 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483"} err="failed to get container status \"75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483\": rpc error: code = NotFound desc = could not find container \"75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483\": container with ID starting with 75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483 not found: ID does not exist" Mar 13 20:48:56 crc kubenswrapper[4790]: I0313 20:48:56.185393 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-77655f674d-4r7h4"] Mar 13 20:48:56 crc kubenswrapper[4790]: I0313 20:48:56.199327 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-77655f674d-4r7h4"] Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.582920 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-jjv8c"] Mar 13 20:48:57 crc kubenswrapper[4790]: E0313 20:48:57.583639 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon-log" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.583655 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon-log" Mar 13 20:48:57 crc kubenswrapper[4790]: E0313 20:48:57.583669 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f53d5c-8b27-4810-a760-f7c9a4ee567b" 
containerName="neutron-httpd" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.583674 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f53d5c-8b27-4810-a760-f7c9a4ee567b" containerName="neutron-httpd" Mar 13 20:48:57 crc kubenswrapper[4790]: E0313 20:48:57.583685 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f53d5c-8b27-4810-a760-f7c9a4ee567b" containerName="neutron-api" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.583691 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f53d5c-8b27-4810-a760-f7c9a4ee567b" containerName="neutron-api" Mar 13 20:48:57 crc kubenswrapper[4790]: E0313 20:48:57.583707 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.583713 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.583872 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.583886 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="96f53d5c-8b27-4810-a760-f7c9a4ee567b" containerName="neutron-api" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.583896 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="96f53d5c-8b27-4810-a760-f7c9a4ee567b" containerName="neutron-httpd" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.583911 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon-log" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.584507 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-jjv8c" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.595082 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-jjv8c"] Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.654177 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c861107-6a1d-49f7-bc63-b95008ee5ddc-operator-scripts\") pod \"nova-api-db-create-jjv8c\" (UID: \"9c861107-6a1d-49f7-bc63-b95008ee5ddc\") " pod="openstack/nova-api-db-create-jjv8c" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.654341 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hr42\" (UniqueName: \"kubernetes.io/projected/9c861107-6a1d-49f7-bc63-b95008ee5ddc-kube-api-access-6hr42\") pod \"nova-api-db-create-jjv8c\" (UID: \"9c861107-6a1d-49f7-bc63-b95008ee5ddc\") " pod="openstack/nova-api-db-create-jjv8c" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.673825 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" path="/var/lib/kubelet/pods/596ad32f-9087-4dbe-a495-8bf03200cd60/volumes" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.674419 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96f53d5c-8b27-4810-a760-f7c9a4ee567b" path="/var/lib/kubelet/pods/96f53d5c-8b27-4810-a760-f7c9a4ee567b/volumes" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.689155 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-fe05-account-create-update-dwwd8"] Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.690545 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-fe05-account-create-update-dwwd8" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.692746 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.704571 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-kq55v"] Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.706337 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-kq55v" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.712953 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-fe05-account-create-update-dwwd8"] Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.721788 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-kq55v"] Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.757247 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c0a379-8f0b-4414-863c-eaed0745ce2d-operator-scripts\") pod \"nova-cell0-db-create-kq55v\" (UID: \"86c0a379-8f0b-4414-863c-eaed0745ce2d\") " pod="openstack/nova-cell0-db-create-kq55v" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.757322 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a4ef124-b4dd-43df-bdfb-97c65685977c-operator-scripts\") pod \"nova-api-fe05-account-create-update-dwwd8\" (UID: \"1a4ef124-b4dd-43df-bdfb-97c65685977c\") " pod="openstack/nova-api-fe05-account-create-update-dwwd8" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.757589 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9c861107-6a1d-49f7-bc63-b95008ee5ddc-operator-scripts\") pod \"nova-api-db-create-jjv8c\" (UID: \"9c861107-6a1d-49f7-bc63-b95008ee5ddc\") " pod="openstack/nova-api-db-create-jjv8c" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.757793 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4vj9\" (UniqueName: \"kubernetes.io/projected/86c0a379-8f0b-4414-863c-eaed0745ce2d-kube-api-access-q4vj9\") pod \"nova-cell0-db-create-kq55v\" (UID: \"86c0a379-8f0b-4414-863c-eaed0745ce2d\") " pod="openstack/nova-cell0-db-create-kq55v" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.757875 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hr42\" (UniqueName: \"kubernetes.io/projected/9c861107-6a1d-49f7-bc63-b95008ee5ddc-kube-api-access-6hr42\") pod \"nova-api-db-create-jjv8c\" (UID: \"9c861107-6a1d-49f7-bc63-b95008ee5ddc\") " pod="openstack/nova-api-db-create-jjv8c" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.757901 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn4b9\" (UniqueName: \"kubernetes.io/projected/1a4ef124-b4dd-43df-bdfb-97c65685977c-kube-api-access-fn4b9\") pod \"nova-api-fe05-account-create-update-dwwd8\" (UID: \"1a4ef124-b4dd-43df-bdfb-97c65685977c\") " pod="openstack/nova-api-fe05-account-create-update-dwwd8" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.767997 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c861107-6a1d-49f7-bc63-b95008ee5ddc-operator-scripts\") pod \"nova-api-db-create-jjv8c\" (UID: \"9c861107-6a1d-49f7-bc63-b95008ee5ddc\") " pod="openstack/nova-api-db-create-jjv8c" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.788013 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-db-create-lrnph"] Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.792722 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-lrnph" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.795833 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hr42\" (UniqueName: \"kubernetes.io/projected/9c861107-6a1d-49f7-bc63-b95008ee5ddc-kube-api-access-6hr42\") pod \"nova-api-db-create-jjv8c\" (UID: \"9c861107-6a1d-49f7-bc63-b95008ee5ddc\") " pod="openstack/nova-api-db-create-jjv8c" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.801406 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-lrnph"] Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.863646 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4vj9\" (UniqueName: \"kubernetes.io/projected/86c0a379-8f0b-4414-863c-eaed0745ce2d-kube-api-access-q4vj9\") pod \"nova-cell0-db-create-kq55v\" (UID: \"86c0a379-8f0b-4414-863c-eaed0745ce2d\") " pod="openstack/nova-cell0-db-create-kq55v" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.863737 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn4b9\" (UniqueName: \"kubernetes.io/projected/1a4ef124-b4dd-43df-bdfb-97c65685977c-kube-api-access-fn4b9\") pod \"nova-api-fe05-account-create-update-dwwd8\" (UID: \"1a4ef124-b4dd-43df-bdfb-97c65685977c\") " pod="openstack/nova-api-fe05-account-create-update-dwwd8" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.863761 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c0a379-8f0b-4414-863c-eaed0745ce2d-operator-scripts\") pod \"nova-cell0-db-create-kq55v\" (UID: \"86c0a379-8f0b-4414-863c-eaed0745ce2d\") " pod="openstack/nova-cell0-db-create-kq55v" Mar 
13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.863797 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a4ef124-b4dd-43df-bdfb-97c65685977c-operator-scripts\") pod \"nova-api-fe05-account-create-update-dwwd8\" (UID: \"1a4ef124-b4dd-43df-bdfb-97c65685977c\") " pod="openstack/nova-api-fe05-account-create-update-dwwd8" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.863859 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddvrs\" (UniqueName: \"kubernetes.io/projected/dcc0f61e-f0ce-4443-9eec-0488ff92b388-kube-api-access-ddvrs\") pod \"nova-cell1-db-create-lrnph\" (UID: \"dcc0f61e-f0ce-4443-9eec-0488ff92b388\") " pod="openstack/nova-cell1-db-create-lrnph" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.863927 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc0f61e-f0ce-4443-9eec-0488ff92b388-operator-scripts\") pod \"nova-cell1-db-create-lrnph\" (UID: \"dcc0f61e-f0ce-4443-9eec-0488ff92b388\") " pod="openstack/nova-cell1-db-create-lrnph" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.864670 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c0a379-8f0b-4414-863c-eaed0745ce2d-operator-scripts\") pod \"nova-cell0-db-create-kq55v\" (UID: \"86c0a379-8f0b-4414-863c-eaed0745ce2d\") " pod="openstack/nova-cell0-db-create-kq55v" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.864869 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a4ef124-b4dd-43df-bdfb-97c65685977c-operator-scripts\") pod \"nova-api-fe05-account-create-update-dwwd8\" (UID: \"1a4ef124-b4dd-43df-bdfb-97c65685977c\") " 
pod="openstack/nova-api-fe05-account-create-update-dwwd8" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.884002 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn4b9\" (UniqueName: \"kubernetes.io/projected/1a4ef124-b4dd-43df-bdfb-97c65685977c-kube-api-access-fn4b9\") pod \"nova-api-fe05-account-create-update-dwwd8\" (UID: \"1a4ef124-b4dd-43df-bdfb-97c65685977c\") " pod="openstack/nova-api-fe05-account-create-update-dwwd8" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.903169 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jjv8c" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.905468 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-926f-account-create-update-nnl2f"] Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.905570 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4vj9\" (UniqueName: \"kubernetes.io/projected/86c0a379-8f0b-4414-863c-eaed0745ce2d-kube-api-access-q4vj9\") pod \"nova-cell0-db-create-kq55v\" (UID: \"86c0a379-8f0b-4414-863c-eaed0745ce2d\") " pod="openstack/nova-cell0-db-create-kq55v" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.907134 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-926f-account-create-update-nnl2f" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.911231 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.915582 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc306890-4355-4f40-abc0-11753b34d120","Type":"ContainerStarted","Data":"dd19d1f47a779bc0eefe03ea425f43911b8fa1cade11d838fb762ff00ee08c99"} Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.915790 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="ceilometer-central-agent" containerID="cri-o://e77c2f06981ff16ce8a83ea4cf86ff45903e943a0fce3443c6ecd2493b205d22" gracePeriod=30 Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.915882 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="ceilometer-notification-agent" containerID="cri-o://427f4e121de836625292dc58d9f628e241b940ecfee11bfb04fb92802c2bd9a9" gracePeriod=30 Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.915884 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="sg-core" containerID="cri-o://c12845e4c31624900dc62dbd98c8791cea0c6f646b09e9fa4c0931ec955bfc38" gracePeriod=30 Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.915940 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="proxy-httpd" containerID="cri-o://dd19d1f47a779bc0eefe03ea425f43911b8fa1cade11d838fb762ff00ee08c99" gracePeriod=30 Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 
20:48:57.915899 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.929908 4790 generic.go:334] "Generic (PLEG): container finished" podID="6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" containerID="96ac0a7c5978eeb8c0f3a4fc52a8593d87b076ec513b819bbd3b74106a8ca70e" exitCode=0 Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.929992 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2","Type":"ContainerDied","Data":"96ac0a7c5978eeb8c0f3a4fc52a8593d87b076ec513b819bbd3b74106a8ca70e"} Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.937139 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-926f-account-create-update-nnl2f"] Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.963726 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=7.915989563 podStartE2EDuration="11.963580989s" podCreationTimestamp="2026-03-13 20:48:46 +0000 UTC" firstStartedPulling="2026-03-13 20:48:52.640350321 +0000 UTC m=+1263.661466212" lastFinishedPulling="2026-03-13 20:48:56.687941747 +0000 UTC m=+1267.709057638" observedRunningTime="2026-03-13 20:48:57.952074305 +0000 UTC m=+1268.973190206" watchObservedRunningTime="2026-03-13 20:48:57.963580989 +0000 UTC m=+1268.984696880" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.969704 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdhw5\" (UniqueName: \"kubernetes.io/projected/536b2b85-21d0-47ba-8825-998dcb7b0058-kube-api-access-qdhw5\") pod \"nova-cell0-926f-account-create-update-nnl2f\" (UID: \"536b2b85-21d0-47ba-8825-998dcb7b0058\") " pod="openstack/nova-cell0-926f-account-create-update-nnl2f" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.969761 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddvrs\" (UniqueName: \"kubernetes.io/projected/dcc0f61e-f0ce-4443-9eec-0488ff92b388-kube-api-access-ddvrs\") pod \"nova-cell1-db-create-lrnph\" (UID: \"dcc0f61e-f0ce-4443-9eec-0488ff92b388\") " pod="openstack/nova-cell1-db-create-lrnph" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.969800 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/536b2b85-21d0-47ba-8825-998dcb7b0058-operator-scripts\") pod \"nova-cell0-926f-account-create-update-nnl2f\" (UID: \"536b2b85-21d0-47ba-8825-998dcb7b0058\") " pod="openstack/nova-cell0-926f-account-create-update-nnl2f" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.969827 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc0f61e-f0ce-4443-9eec-0488ff92b388-operator-scripts\") pod \"nova-cell1-db-create-lrnph\" (UID: \"dcc0f61e-f0ce-4443-9eec-0488ff92b388\") " pod="openstack/nova-cell1-db-create-lrnph" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.970593 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc0f61e-f0ce-4443-9eec-0488ff92b388-operator-scripts\") pod \"nova-cell1-db-create-lrnph\" (UID: \"dcc0f61e-f0ce-4443-9eec-0488ff92b388\") " pod="openstack/nova-cell1-db-create-lrnph" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.002465 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddvrs\" (UniqueName: \"kubernetes.io/projected/dcc0f61e-f0ce-4443-9eec-0488ff92b388-kube-api-access-ddvrs\") pod \"nova-cell1-db-create-lrnph\" (UID: \"dcc0f61e-f0ce-4443-9eec-0488ff92b388\") " pod="openstack/nova-cell1-db-create-lrnph" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.008104 
4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fe05-account-create-update-dwwd8" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.034300 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-kq55v" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.076902 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdhw5\" (UniqueName: \"kubernetes.io/projected/536b2b85-21d0-47ba-8825-998dcb7b0058-kube-api-access-qdhw5\") pod \"nova-cell0-926f-account-create-update-nnl2f\" (UID: \"536b2b85-21d0-47ba-8825-998dcb7b0058\") " pod="openstack/nova-cell0-926f-account-create-update-nnl2f" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.076986 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/536b2b85-21d0-47ba-8825-998dcb7b0058-operator-scripts\") pod \"nova-cell0-926f-account-create-update-nnl2f\" (UID: \"536b2b85-21d0-47ba-8825-998dcb7b0058\") " pod="openstack/nova-cell0-926f-account-create-update-nnl2f" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.085484 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/536b2b85-21d0-47ba-8825-998dcb7b0058-operator-scripts\") pod \"nova-cell0-926f-account-create-update-nnl2f\" (UID: \"536b2b85-21d0-47ba-8825-998dcb7b0058\") " pod="openstack/nova-cell0-926f-account-create-update-nnl2f" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.100699 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-8821-account-create-update-l6ffx"] Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.102189 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8821-account-create-update-l6ffx" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.106663 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.110822 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8821-account-create-update-l6ffx"] Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.120057 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdhw5\" (UniqueName: \"kubernetes.io/projected/536b2b85-21d0-47ba-8825-998dcb7b0058-kube-api-access-qdhw5\") pod \"nova-cell0-926f-account-create-update-nnl2f\" (UID: \"536b2b85-21d0-47ba-8825-998dcb7b0058\") " pod="openstack/nova-cell0-926f-account-create-update-nnl2f" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.152240 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-lrnph" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.178931 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00f4f78b-ccfb-4413-9a81-d5b461a5e319-operator-scripts\") pod \"nova-cell1-8821-account-create-update-l6ffx\" (UID: \"00f4f78b-ccfb-4413-9a81-d5b461a5e319\") " pod="openstack/nova-cell1-8821-account-create-update-l6ffx" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.179087 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg9bq\" (UniqueName: \"kubernetes.io/projected/00f4f78b-ccfb-4413-9a81-d5b461a5e319-kube-api-access-jg9bq\") pod \"nova-cell1-8821-account-create-update-l6ffx\" (UID: \"00f4f78b-ccfb-4413-9a81-d5b461a5e319\") " pod="openstack/nova-cell1-8821-account-create-update-l6ffx" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.280214 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg9bq\" (UniqueName: \"kubernetes.io/projected/00f4f78b-ccfb-4413-9a81-d5b461a5e319-kube-api-access-jg9bq\") pod \"nova-cell1-8821-account-create-update-l6ffx\" (UID: \"00f4f78b-ccfb-4413-9a81-d5b461a5e319\") " pod="openstack/nova-cell1-8821-account-create-update-l6ffx" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.280608 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00f4f78b-ccfb-4413-9a81-d5b461a5e319-operator-scripts\") pod \"nova-cell1-8821-account-create-update-l6ffx\" (UID: \"00f4f78b-ccfb-4413-9a81-d5b461a5e319\") " pod="openstack/nova-cell1-8821-account-create-update-l6ffx" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.281251 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00f4f78b-ccfb-4413-9a81-d5b461a5e319-operator-scripts\") pod \"nova-cell1-8821-account-create-update-l6ffx\" (UID: \"00f4f78b-ccfb-4413-9a81-d5b461a5e319\") " pod="openstack/nova-cell1-8821-account-create-update-l6ffx" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.327549 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg9bq\" (UniqueName: \"kubernetes.io/projected/00f4f78b-ccfb-4413-9a81-d5b461a5e319-kube-api-access-jg9bq\") pod \"nova-cell1-8821-account-create-update-l6ffx\" (UID: \"00f4f78b-ccfb-4413-9a81-d5b461a5e319\") " pod="openstack/nova-cell1-8821-account-create-update-l6ffx" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.343943 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.344220 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="773bad92-580e-4a9c-9ba5-eef9d8bbc40d" containerName="glance-log" containerID="cri-o://6f71db6d93e0c718a70afb3c8920d1131d779aebc23ca039251f6366967791e6" gracePeriod=30 Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.344817 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="773bad92-580e-4a9c-9ba5-eef9d8bbc40d" containerName="glance-httpd" containerID="cri-o://5ed056308fa78044710942e2c9cea38e859d820f8739efda67e4e603f99c4343" gracePeriod=30 Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.368066 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-926f-account-create-update-nnl2f" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.424343 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8821-account-create-update-l6ffx" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.629281 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-jjv8c"] Mar 13 20:48:58 crc kubenswrapper[4790]: W0313 20:48:58.631731 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c861107_6a1d_49f7_bc63_b95008ee5ddc.slice/crio-42edcd948c944d64b36bfee1b171b205451d925a6df9a6b7c585b9771af386fe WatchSource:0}: Error finding container 42edcd948c944d64b36bfee1b171b205451d925a6df9a6b7c585b9771af386fe: Status 404 returned error can't find the container with id 42edcd948c944d64b36bfee1b171b205451d925a6df9a6b7c585b9771af386fe Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.721077 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.803751 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-combined-ca-bundle\") pod \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.803811 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-logs\") pod \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.803881 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-httpd-run\") pod \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.803914 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2cwz\" (UniqueName: \"kubernetes.io/projected/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-kube-api-access-j2cwz\") pod \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.804074 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-public-tls-certs\") pod \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.804165 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-scripts\") pod \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.804208 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.804448 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-config-data\") pod \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.810115 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-logs" (OuterVolumeSpecName: "logs") pod "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" (UID: "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.810434 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" (UID: "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.825421 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" (UID: "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2"). 
InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.825547 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-scripts" (OuterVolumeSpecName: "scripts") pod "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" (UID: "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.836670 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-kube-api-access-j2cwz" (OuterVolumeSpecName: "kube-api-access-j2cwz") pod "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" (UID: "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2"). InnerVolumeSpecName "kube-api-access-j2cwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.889497 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-kq55v"] Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.907249 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.907278 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2cwz\" (UniqueName: \"kubernetes.io/projected/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-kube-api-access-j2cwz\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.907287 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.907307 
4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.907318 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.916956 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-fe05-account-create-update-dwwd8"] Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.948347 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kq55v" event={"ID":"86c0a379-8f0b-4414-863c-eaed0745ce2d","Type":"ContainerStarted","Data":"e654e8a2a2f4ddba0cbe74c5ccd432a4c0611cace807bcb47d1cab56837685c1"} Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.986127 4790 generic.go:334] "Generic (PLEG): container finished" podID="dc306890-4355-4f40-abc0-11753b34d120" containerID="dd19d1f47a779bc0eefe03ea425f43911b8fa1cade11d838fb762ff00ee08c99" exitCode=0 Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.986173 4790 generic.go:334] "Generic (PLEG): container finished" podID="dc306890-4355-4f40-abc0-11753b34d120" containerID="c12845e4c31624900dc62dbd98c8791cea0c6f646b09e9fa4c0931ec955bfc38" exitCode=2 Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.986204 4790 generic.go:334] "Generic (PLEG): container finished" podID="dc306890-4355-4f40-abc0-11753b34d120" containerID="427f4e121de836625292dc58d9f628e241b940ecfee11bfb04fb92802c2bd9a9" exitCode=0 Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.986264 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc306890-4355-4f40-abc0-11753b34d120","Type":"ContainerDied","Data":"dd19d1f47a779bc0eefe03ea425f43911b8fa1cade11d838fb762ff00ee08c99"} Mar 13 
20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.986296 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc306890-4355-4f40-abc0-11753b34d120","Type":"ContainerDied","Data":"c12845e4c31624900dc62dbd98c8791cea0c6f646b09e9fa4c0931ec955bfc38"} Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.986311 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc306890-4355-4f40-abc0-11753b34d120","Type":"ContainerDied","Data":"427f4e121de836625292dc58d9f628e241b940ecfee11bfb04fb92802c2bd9a9"} Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.992093 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fe05-account-create-update-dwwd8" event={"ID":"1a4ef124-b4dd-43df-bdfb-97c65685977c","Type":"ContainerStarted","Data":"a482d341a0588b1286360a3c7bf6118a7d5c154aafca78a5c2aa6c70a4917ca8"} Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.001553 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" (UID: "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.003761 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.009477 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-lrnph"] Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.010335 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.010404 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.010576 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jjv8c" event={"ID":"9c861107-6a1d-49f7-bc63-b95008ee5ddc","Type":"ContainerStarted","Data":"42edcd948c944d64b36bfee1b171b205451d925a6df9a6b7c585b9771af386fe"} Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.013081 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2","Type":"ContainerDied","Data":"ba875bd508f6a929ed72f4f60e05be777631ac626ac3eec05ada1ba30d28bfc5"} Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.013101 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.013131 4790 scope.go:117] "RemoveContainer" containerID="96ac0a7c5978eeb8c0f3a4fc52a8593d87b076ec513b819bbd3b74106a8ca70e" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.024605 4790 generic.go:334] "Generic (PLEG): container finished" podID="773bad92-580e-4a9c-9ba5-eef9d8bbc40d" containerID="6f71db6d93e0c718a70afb3c8920d1131d779aebc23ca039251f6366967791e6" exitCode=143 Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.024666 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"773bad92-580e-4a9c-9ba5-eef9d8bbc40d","Type":"ContainerDied","Data":"6f71db6d93e0c718a70afb3c8920d1131d779aebc23ca039251f6366967791e6"} Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.056546 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-config-data" (OuterVolumeSpecName: "config-data") pod "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" (UID: "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.078704 4790 scope.go:117] "RemoveContainer" containerID="4349a4319d7d7f3a7af4e8d8122ef2003198a82dbec9b58b843ef6769bc7f33d" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.097327 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" (UID: "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.119268 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.119306 4790 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.135721 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-926f-account-create-update-nnl2f"] Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.146091 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8821-account-create-update-l6ffx"] Mar 13 20:48:59 crc kubenswrapper[4790]: W0313 20:48:59.157344 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00f4f78b_ccfb_4413_9a81_d5b461a5e319.slice/crio-69b6bff4250914cdb389db496c666bea50a463e2d929ddbb170528cbe4829809 WatchSource:0}: Error finding container 69b6bff4250914cdb389db496c666bea50a463e2d929ddbb170528cbe4829809: Status 404 returned error can't find the container with id 69b6bff4250914cdb389db496c666bea50a463e2d929ddbb170528cbe4829809 Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.414056 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.452550 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.465240 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:48:59 crc 
kubenswrapper[4790]: E0313 20:48:59.465784 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" containerName="glance-log" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.465798 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" containerName="glance-log" Mar 13 20:48:59 crc kubenswrapper[4790]: E0313 20:48:59.465815 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" containerName="glance-httpd" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.465820 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" containerName="glance-httpd" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.466013 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" containerName="glance-httpd" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.466026 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" containerName="glance-log" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.473539 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.476090 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.476412 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.486786 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.536425 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-scripts\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.536514 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn6rb\" (UniqueName: \"kubernetes.io/projected/8c1c1847-eb77-4170-8034-e58ba375ad84-kube-api-access-rn6rb\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.536590 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.536617 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.536720 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-config-data\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.536742 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c1c1847-eb77-4170-8034-e58ba375ad84-logs\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.536767 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.536799 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c1c1847-eb77-4170-8034-e58ba375ad84-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.638814 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-config-data\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.638858 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c1c1847-eb77-4170-8034-e58ba375ad84-logs\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.638885 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.638923 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c1c1847-eb77-4170-8034-e58ba375ad84-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.638961 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-scripts\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.639004 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn6rb\" (UniqueName: 
\"kubernetes.io/projected/8c1c1847-eb77-4170-8034-e58ba375ad84-kube-api-access-rn6rb\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.639060 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.639085 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.639561 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.640325 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c1c1847-eb77-4170-8034-e58ba375ad84-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.640674 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c1c1847-eb77-4170-8034-e58ba375ad84-logs\") pod 
\"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.649484 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.649952 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-scripts\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.650091 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.651408 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-config-data\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.661414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn6rb\" (UniqueName: \"kubernetes.io/projected/8c1c1847-eb77-4170-8034-e58ba375ad84-kube-api-access-rn6rb\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " 
pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.684542 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" path="/var/lib/kubelet/pods/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2/volumes" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.686811 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.929261 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.039629 4790 generic.go:334] "Generic (PLEG): container finished" podID="86c0a379-8f0b-4414-863c-eaed0745ce2d" containerID="749c82e4067fc52a2714101b9401b4c82b0470e8a2bd0821a82732111bf3a2ae" exitCode=0 Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.039858 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kq55v" event={"ID":"86c0a379-8f0b-4414-863c-eaed0745ce2d","Type":"ContainerDied","Data":"749c82e4067fc52a2714101b9401b4c82b0470e8a2bd0821a82732111bf3a2ae"} Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.042195 4790 generic.go:334] "Generic (PLEG): container finished" podID="1a4ef124-b4dd-43df-bdfb-97c65685977c" containerID="ac99b8592ceb7c3e6a37fbb0c9de0300f9c9ee5a2b4807abffe2d2ed52e8fe04" exitCode=0 Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.042256 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fe05-account-create-update-dwwd8" event={"ID":"1a4ef124-b4dd-43df-bdfb-97c65685977c","Type":"ContainerDied","Data":"ac99b8592ceb7c3e6a37fbb0c9de0300f9c9ee5a2b4807abffe2d2ed52e8fe04"} Mar 13 
20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.044766 4790 generic.go:334] "Generic (PLEG): container finished" podID="9c861107-6a1d-49f7-bc63-b95008ee5ddc" containerID="c0e58f35f1d7b48efbdbbc91a297aa591c210bb71e60644cb81c14c40a9e45cb" exitCode=0 Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.044841 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jjv8c" event={"ID":"9c861107-6a1d-49f7-bc63-b95008ee5ddc","Type":"ContainerDied","Data":"c0e58f35f1d7b48efbdbbc91a297aa591c210bb71e60644cb81c14c40a9e45cb"} Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.080131 4790 generic.go:334] "Generic (PLEG): container finished" podID="536b2b85-21d0-47ba-8825-998dcb7b0058" containerID="ecda3f7499b0977157d22e381725d43a5571bfd9425676b723008c4d5d967330" exitCode=0 Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.080286 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-926f-account-create-update-nnl2f" event={"ID":"536b2b85-21d0-47ba-8825-998dcb7b0058","Type":"ContainerDied","Data":"ecda3f7499b0977157d22e381725d43a5571bfd9425676b723008c4d5d967330"} Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.080353 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-926f-account-create-update-nnl2f" event={"ID":"536b2b85-21d0-47ba-8825-998dcb7b0058","Type":"ContainerStarted","Data":"a28205db680e41155b7e2c6e7dd8da8bc4d10a1ee2a526bd3778cf937317c277"} Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.125581 4790 generic.go:334] "Generic (PLEG): container finished" podID="00f4f78b-ccfb-4413-9a81-d5b461a5e319" containerID="2532c9c9471a4f51d2c72742172102590d5f8b86465110fbcffff19c31b75b68" exitCode=0 Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.125647 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8821-account-create-update-l6ffx" 
event={"ID":"00f4f78b-ccfb-4413-9a81-d5b461a5e319","Type":"ContainerDied","Data":"2532c9c9471a4f51d2c72742172102590d5f8b86465110fbcffff19c31b75b68"} Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.125685 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8821-account-create-update-l6ffx" event={"ID":"00f4f78b-ccfb-4413-9a81-d5b461a5e319","Type":"ContainerStarted","Data":"69b6bff4250914cdb389db496c666bea50a463e2d929ddbb170528cbe4829809"} Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.128105 4790 generic.go:334] "Generic (PLEG): container finished" podID="dcc0f61e-f0ce-4443-9eec-0488ff92b388" containerID="15f4fd3d9e2092ff500a17b34ac7be646f532a2e8275aea162c7ec8133dbdbed" exitCode=0 Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.128147 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lrnph" event={"ID":"dcc0f61e-f0ce-4443-9eec-0488ff92b388","Type":"ContainerDied","Data":"15f4fd3d9e2092ff500a17b34ac7be646f532a2e8275aea162c7ec8133dbdbed"} Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.128189 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lrnph" event={"ID":"dcc0f61e-f0ce-4443-9eec-0488ff92b388","Type":"ContainerStarted","Data":"9fc00c50ad54c36a8895a69bc200716f7b29a72c8e6c77a86f7e0ef0f4300cd7"} Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.314228 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.475831 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.139071 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"8c1c1847-eb77-4170-8034-e58ba375ad84","Type":"ContainerStarted","Data":"212ef3834c313c563c3500c5f8a4cd559dfe0819e1a567dfece2c1a2feab01d6"} Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.481436 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fe05-account-create-update-dwwd8" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.586037 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a4ef124-b4dd-43df-bdfb-97c65685977c-operator-scripts\") pod \"1a4ef124-b4dd-43df-bdfb-97c65685977c\" (UID: \"1a4ef124-b4dd-43df-bdfb-97c65685977c\") " Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.586799 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn4b9\" (UniqueName: \"kubernetes.io/projected/1a4ef124-b4dd-43df-bdfb-97c65685977c-kube-api-access-fn4b9\") pod \"1a4ef124-b4dd-43df-bdfb-97c65685977c\" (UID: \"1a4ef124-b4dd-43df-bdfb-97c65685977c\") " Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.588033 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a4ef124-b4dd-43df-bdfb-97c65685977c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1a4ef124-b4dd-43df-bdfb-97c65685977c" (UID: "1a4ef124-b4dd-43df-bdfb-97c65685977c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.599847 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a4ef124-b4dd-43df-bdfb-97c65685977c-kube-api-access-fn4b9" (OuterVolumeSpecName: "kube-api-access-fn4b9") pod "1a4ef124-b4dd-43df-bdfb-97c65685977c" (UID: "1a4ef124-b4dd-43df-bdfb-97c65685977c"). InnerVolumeSpecName "kube-api-access-fn4b9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.689721 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a4ef124-b4dd-43df-bdfb-97c65685977c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.689763 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fn4b9\" (UniqueName: \"kubernetes.io/projected/1a4ef124-b4dd-43df-bdfb-97c65685977c-kube-api-access-fn4b9\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.865747 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.867105 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.883318 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-jjv8c" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.893548 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hr42\" (UniqueName: \"kubernetes.io/projected/9c861107-6a1d-49f7-bc63-b95008ee5ddc-kube-api-access-6hr42\") pod \"9c861107-6a1d-49f7-bc63-b95008ee5ddc\" (UID: \"9c861107-6a1d-49f7-bc63-b95008ee5ddc\") " Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.893626 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c861107-6a1d-49f7-bc63-b95008ee5ddc-operator-scripts\") pod \"9c861107-6a1d-49f7-bc63-b95008ee5ddc\" (UID: \"9c861107-6a1d-49f7-bc63-b95008ee5ddc\") " Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.896301 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c861107-6a1d-49f7-bc63-b95008ee5ddc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9c861107-6a1d-49f7-bc63-b95008ee5ddc" (UID: "9c861107-6a1d-49f7-bc63-b95008ee5ddc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.900455 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c861107-6a1d-49f7-bc63-b95008ee5ddc-kube-api-access-6hr42" (OuterVolumeSpecName: "kube-api-access-6hr42") pod "9c861107-6a1d-49f7-bc63-b95008ee5ddc" (UID: "9c861107-6a1d-49f7-bc63-b95008ee5ddc"). InnerVolumeSpecName "kube-api-access-6hr42". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.902094 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-lrnph" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.990898 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6cd9b448d6-w8fcr"] Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.995259 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddvrs\" (UniqueName: \"kubernetes.io/projected/dcc0f61e-f0ce-4443-9eec-0488ff92b388-kube-api-access-ddvrs\") pod \"dcc0f61e-f0ce-4443-9eec-0488ff92b388\" (UID: \"dcc0f61e-f0ce-4443-9eec-0488ff92b388\") " Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.995357 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc0f61e-f0ce-4443-9eec-0488ff92b388-operator-scripts\") pod \"dcc0f61e-f0ce-4443-9eec-0488ff92b388\" (UID: \"dcc0f61e-f0ce-4443-9eec-0488ff92b388\") " Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.995855 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hr42\" (UniqueName: \"kubernetes.io/projected/9c861107-6a1d-49f7-bc63-b95008ee5ddc-kube-api-access-6hr42\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.995875 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c861107-6a1d-49f7-bc63-b95008ee5ddc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.996356 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcc0f61e-f0ce-4443-9eec-0488ff92b388-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dcc0f61e-f0ce-4443-9eec-0488ff92b388" (UID: "dcc0f61e-f0ce-4443-9eec-0488ff92b388"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.003230 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6cd9b448d6-w8fcr" podUID="88252e8c-21d9-402a-bab0-9f61b5eb3a70" containerName="placement-log" containerID="cri-o://f4b58b71174400c77e39715f7e4970a3816e119db2203fb9220a857f485f79bd" gracePeriod=30 Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.003476 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6cd9b448d6-w8fcr" podUID="88252e8c-21d9-402a-bab0-9f61b5eb3a70" containerName="placement-api" containerID="cri-o://963374fd67ec679caf00dd9bcc27806bbcfe92963bf34ed2f7df82c29a36025b" gracePeriod=30 Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.008729 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcc0f61e-f0ce-4443-9eec-0488ff92b388-kube-api-access-ddvrs" (OuterVolumeSpecName: "kube-api-access-ddvrs") pod "dcc0f61e-f0ce-4443-9eec-0488ff92b388" (UID: "dcc0f61e-f0ce-4443-9eec-0488ff92b388"). InnerVolumeSpecName "kube-api-access-ddvrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.033183 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8821-account-create-update-l6ffx" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.039630 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-926f-account-create-update-nnl2f" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.097798 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddvrs\" (UniqueName: \"kubernetes.io/projected/dcc0f61e-f0ce-4443-9eec-0488ff92b388-kube-api-access-ddvrs\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.097832 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc0f61e-f0ce-4443-9eec-0488ff92b388-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.102554 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-kq55v" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.166856 4790 generic.go:334] "Generic (PLEG): container finished" podID="88252e8c-21d9-402a-bab0-9f61b5eb3a70" containerID="f4b58b71174400c77e39715f7e4970a3816e119db2203fb9220a857f485f79bd" exitCode=143 Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.166928 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cd9b448d6-w8fcr" event={"ID":"88252e8c-21d9-402a-bab0-9f61b5eb3a70","Type":"ContainerDied","Data":"f4b58b71174400c77e39715f7e4970a3816e119db2203fb9220a857f485f79bd"} Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.174128 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c1c1847-eb77-4170-8034-e58ba375ad84","Type":"ContainerStarted","Data":"fbaa139db9b0d8e939decfd665a0d823d372df7d7447765076205b24cf476904"} Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.187041 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jjv8c" 
event={"ID":"9c861107-6a1d-49f7-bc63-b95008ee5ddc","Type":"ContainerDied","Data":"42edcd948c944d64b36bfee1b171b205451d925a6df9a6b7c585b9771af386fe"} Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.187093 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42edcd948c944d64b36bfee1b171b205451d925a6df9a6b7c585b9771af386fe" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.187171 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jjv8c" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.202450 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/536b2b85-21d0-47ba-8825-998dcb7b0058-operator-scripts\") pod \"536b2b85-21d0-47ba-8825-998dcb7b0058\" (UID: \"536b2b85-21d0-47ba-8825-998dcb7b0058\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.202570 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdhw5\" (UniqueName: \"kubernetes.io/projected/536b2b85-21d0-47ba-8825-998dcb7b0058-kube-api-access-qdhw5\") pod \"536b2b85-21d0-47ba-8825-998dcb7b0058\" (UID: \"536b2b85-21d0-47ba-8825-998dcb7b0058\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.202593 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4vj9\" (UniqueName: \"kubernetes.io/projected/86c0a379-8f0b-4414-863c-eaed0745ce2d-kube-api-access-q4vj9\") pod \"86c0a379-8f0b-4414-863c-eaed0745ce2d\" (UID: \"86c0a379-8f0b-4414-863c-eaed0745ce2d\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.202621 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg9bq\" (UniqueName: \"kubernetes.io/projected/00f4f78b-ccfb-4413-9a81-d5b461a5e319-kube-api-access-jg9bq\") pod \"00f4f78b-ccfb-4413-9a81-d5b461a5e319\" (UID: 
\"00f4f78b-ccfb-4413-9a81-d5b461a5e319\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.202649 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00f4f78b-ccfb-4413-9a81-d5b461a5e319-operator-scripts\") pod \"00f4f78b-ccfb-4413-9a81-d5b461a5e319\" (UID: \"00f4f78b-ccfb-4413-9a81-d5b461a5e319\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.202690 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c0a379-8f0b-4414-863c-eaed0745ce2d-operator-scripts\") pod \"86c0a379-8f0b-4414-863c-eaed0745ce2d\" (UID: \"86c0a379-8f0b-4414-863c-eaed0745ce2d\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.203730 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/536b2b85-21d0-47ba-8825-998dcb7b0058-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "536b2b85-21d0-47ba-8825-998dcb7b0058" (UID: "536b2b85-21d0-47ba-8825-998dcb7b0058"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.203917 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86c0a379-8f0b-4414-863c-eaed0745ce2d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "86c0a379-8f0b-4414-863c-eaed0745ce2d" (UID: "86c0a379-8f0b-4414-863c-eaed0745ce2d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.205008 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00f4f78b-ccfb-4413-9a81-d5b461a5e319-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "00f4f78b-ccfb-4413-9a81-d5b461a5e319" (UID: "00f4f78b-ccfb-4413-9a81-d5b461a5e319"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.208865 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86c0a379-8f0b-4414-863c-eaed0745ce2d-kube-api-access-q4vj9" (OuterVolumeSpecName: "kube-api-access-q4vj9") pod "86c0a379-8f0b-4414-863c-eaed0745ce2d" (UID: "86c0a379-8f0b-4414-863c-eaed0745ce2d"). InnerVolumeSpecName "kube-api-access-q4vj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.212886 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/536b2b85-21d0-47ba-8825-998dcb7b0058-kube-api-access-qdhw5" (OuterVolumeSpecName: "kube-api-access-qdhw5") pod "536b2b85-21d0-47ba-8825-998dcb7b0058" (UID: "536b2b85-21d0-47ba-8825-998dcb7b0058"). InnerVolumeSpecName "kube-api-access-qdhw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.215025 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00f4f78b-ccfb-4413-9a81-d5b461a5e319-kube-api-access-jg9bq" (OuterVolumeSpecName: "kube-api-access-jg9bq") pod "00f4f78b-ccfb-4413-9a81-d5b461a5e319" (UID: "00f4f78b-ccfb-4413-9a81-d5b461a5e319"). InnerVolumeSpecName "kube-api-access-jg9bq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.220516 4790 generic.go:334] "Generic (PLEG): container finished" podID="773bad92-580e-4a9c-9ba5-eef9d8bbc40d" containerID="5ed056308fa78044710942e2c9cea38e859d820f8739efda67e4e603f99c4343" exitCode=0 Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.220576 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"773bad92-580e-4a9c-9ba5-eef9d8bbc40d","Type":"ContainerDied","Data":"5ed056308fa78044710942e2c9cea38e859d820f8739efda67e4e603f99c4343"} Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.232531 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lrnph" event={"ID":"dcc0f61e-f0ce-4443-9eec-0488ff92b388","Type":"ContainerDied","Data":"9fc00c50ad54c36a8895a69bc200716f7b29a72c8e6c77a86f7e0ef0f4300cd7"} Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.232576 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fc00c50ad54c36a8895a69bc200716f7b29a72c8e6c77a86f7e0ef0f4300cd7" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.232646 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-lrnph" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.238114 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kq55v" event={"ID":"86c0a379-8f0b-4414-863c-eaed0745ce2d","Type":"ContainerDied","Data":"e654e8a2a2f4ddba0cbe74c5ccd432a4c0611cace807bcb47d1cab56837685c1"} Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.238159 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e654e8a2a2f4ddba0cbe74c5ccd432a4c0611cace807bcb47d1cab56837685c1" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.238226 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-kq55v" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.254097 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fe05-account-create-update-dwwd8" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.254107 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fe05-account-create-update-dwwd8" event={"ID":"1a4ef124-b4dd-43df-bdfb-97c65685977c","Type":"ContainerDied","Data":"a482d341a0588b1286360a3c7bf6118a7d5c154aafca78a5c2aa6c70a4917ca8"} Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.254149 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a482d341a0588b1286360a3c7bf6118a7d5c154aafca78a5c2aa6c70a4917ca8" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.261490 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-926f-account-create-update-nnl2f" event={"ID":"536b2b85-21d0-47ba-8825-998dcb7b0058","Type":"ContainerDied","Data":"a28205db680e41155b7e2c6e7dd8da8bc4d10a1ee2a526bd3778cf937317c277"} Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.261537 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a28205db680e41155b7e2c6e7dd8da8bc4d10a1ee2a526bd3778cf937317c277" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.261593 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-926f-account-create-update-nnl2f" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.280496 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8821-account-create-update-l6ffx" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.280528 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8821-account-create-update-l6ffx" event={"ID":"00f4f78b-ccfb-4413-9a81-d5b461a5e319","Type":"ContainerDied","Data":"69b6bff4250914cdb389db496c666bea50a463e2d929ddbb170528cbe4829809"} Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.280762 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69b6bff4250914cdb389db496c666bea50a463e2d929ddbb170528cbe4829809" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.301528 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.305789 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/536b2b85-21d0-47ba-8825-998dcb7b0058-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.305834 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdhw5\" (UniqueName: \"kubernetes.io/projected/536b2b85-21d0-47ba-8825-998dcb7b0058-kube-api-access-qdhw5\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.305848 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4vj9\" (UniqueName: \"kubernetes.io/projected/86c0a379-8f0b-4414-863c-eaed0745ce2d-kube-api-access-q4vj9\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.305858 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg9bq\" (UniqueName: \"kubernetes.io/projected/00f4f78b-ccfb-4413-9a81-d5b461a5e319-kube-api-access-jg9bq\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 
20:49:02.305870 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00f4f78b-ccfb-4413-9a81-d5b461a5e319-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.305881 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c0a379-8f0b-4414-863c-eaed0745ce2d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.406946 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-internal-tls-certs\") pod \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.407103 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-logs\") pod \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.407204 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg82c\" (UniqueName: \"kubernetes.io/projected/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-kube-api-access-qg82c\") pod \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.407256 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.407397 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-httpd-run\") pod \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.407453 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-config-data\") pod \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.407521 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-combined-ca-bundle\") pod \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.407618 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-scripts\") pod \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.407950 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-logs" (OuterVolumeSpecName: "logs") pod "773bad92-580e-4a9c-9ba5-eef9d8bbc40d" (UID: "773bad92-580e-4a9c-9ba5-eef9d8bbc40d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.408136 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "773bad92-580e-4a9c-9ba5-eef9d8bbc40d" (UID: "773bad92-580e-4a9c-9ba5-eef9d8bbc40d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.408827 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.408852 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.413450 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "773bad92-580e-4a9c-9ba5-eef9d8bbc40d" (UID: "773bad92-580e-4a9c-9ba5-eef9d8bbc40d"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.414355 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-kube-api-access-qg82c" (OuterVolumeSpecName: "kube-api-access-qg82c") pod "773bad92-580e-4a9c-9ba5-eef9d8bbc40d" (UID: "773bad92-580e-4a9c-9ba5-eef9d8bbc40d"). InnerVolumeSpecName "kube-api-access-qg82c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.418710 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-scripts" (OuterVolumeSpecName: "scripts") pod "773bad92-580e-4a9c-9ba5-eef9d8bbc40d" (UID: "773bad92-580e-4a9c-9ba5-eef9d8bbc40d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.467247 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "773bad92-580e-4a9c-9ba5-eef9d8bbc40d" (UID: "773bad92-580e-4a9c-9ba5-eef9d8bbc40d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.519265 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.521561 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg82c\" (UniqueName: \"kubernetes.io/projected/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-kube-api-access-qg82c\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.521677 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.522169 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc 
kubenswrapper[4790]: I0313 20:49:02.549023 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "773bad92-580e-4a9c-9ba5-eef9d8bbc40d" (UID: "773bad92-580e-4a9c-9ba5-eef9d8bbc40d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.551149 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.554528 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-config-data" (OuterVolumeSpecName: "config-data") pod "773bad92-580e-4a9c-9ba5-eef9d8bbc40d" (UID: "773bad92-580e-4a9c-9ba5-eef9d8bbc40d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.625632 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.625678 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.625694 4790 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.293753 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c1c1847-eb77-4170-8034-e58ba375ad84","Type":"ContainerStarted","Data":"7f1ce3af76cd7e08593479522074ad8373a74ed940c79b7138f7c00e89bb3da5"} Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.298638 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"773bad92-580e-4a9c-9ba5-eef9d8bbc40d","Type":"ContainerDied","Data":"b2d041fbf6a68ca43a859ff33ee8b3f4522929d6bbc2ac451a4da91c437362dd"} Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.298691 4790 scope.go:117] "RemoveContainer" containerID="5ed056308fa78044710942e2c9cea38e859d820f8739efda67e4e603f99c4343" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.298819 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.310461 4790 generic.go:334] "Generic (PLEG): container finished" podID="dc306890-4355-4f40-abc0-11753b34d120" containerID="e77c2f06981ff16ce8a83ea4cf86ff45903e943a0fce3443c6ecd2493b205d22" exitCode=0 Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.310510 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc306890-4355-4f40-abc0-11753b34d120","Type":"ContainerDied","Data":"e77c2f06981ff16ce8a83ea4cf86ff45903e943a0fce3443c6ecd2493b205d22"} Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.339533 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.339509095 podStartE2EDuration="4.339509095s" podCreationTimestamp="2026-03-13 20:48:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:49:03.333655265 +0000 UTC m=+1274.354771156" watchObservedRunningTime="2026-03-13 20:49:03.339509095 +0000 UTC m=+1274.360624986" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.367124 4790 scope.go:117] "RemoveContainer" containerID="6f71db6d93e0c718a70afb3c8920d1131d779aebc23ca039251f6366967791e6" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.399441 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.410464 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.460612 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:49:03 crc kubenswrapper[4790]: E0313 20:49:03.461026 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="00f4f78b-ccfb-4413-9a81-d5b461a5e319" containerName="mariadb-account-create-update" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461046 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="00f4f78b-ccfb-4413-9a81-d5b461a5e319" containerName="mariadb-account-create-update" Mar 13 20:49:03 crc kubenswrapper[4790]: E0313 20:49:03.461056 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="773bad92-580e-4a9c-9ba5-eef9d8bbc40d" containerName="glance-httpd" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461062 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="773bad92-580e-4a9c-9ba5-eef9d8bbc40d" containerName="glance-httpd" Mar 13 20:49:03 crc kubenswrapper[4790]: E0313 20:49:03.461071 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="536b2b85-21d0-47ba-8825-998dcb7b0058" containerName="mariadb-account-create-update" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461077 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="536b2b85-21d0-47ba-8825-998dcb7b0058" containerName="mariadb-account-create-update" Mar 13 20:49:03 crc kubenswrapper[4790]: E0313 20:49:03.461089 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a4ef124-b4dd-43df-bdfb-97c65685977c" containerName="mariadb-account-create-update" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461097 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a4ef124-b4dd-43df-bdfb-97c65685977c" containerName="mariadb-account-create-update" Mar 13 20:49:03 crc kubenswrapper[4790]: E0313 20:49:03.461107 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c0a379-8f0b-4414-863c-eaed0745ce2d" containerName="mariadb-database-create" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461113 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c0a379-8f0b-4414-863c-eaed0745ce2d" containerName="mariadb-database-create" Mar 13 20:49:03 crc kubenswrapper[4790]: E0313 
20:49:03.461128 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="773bad92-580e-4a9c-9ba5-eef9d8bbc40d" containerName="glance-log" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461133 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="773bad92-580e-4a9c-9ba5-eef9d8bbc40d" containerName="glance-log" Mar 13 20:49:03 crc kubenswrapper[4790]: E0313 20:49:03.461143 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcc0f61e-f0ce-4443-9eec-0488ff92b388" containerName="mariadb-database-create" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461150 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcc0f61e-f0ce-4443-9eec-0488ff92b388" containerName="mariadb-database-create" Mar 13 20:49:03 crc kubenswrapper[4790]: E0313 20:49:03.461158 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c861107-6a1d-49f7-bc63-b95008ee5ddc" containerName="mariadb-database-create" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461163 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c861107-6a1d-49f7-bc63-b95008ee5ddc" containerName="mariadb-database-create" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461320 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcc0f61e-f0ce-4443-9eec-0488ff92b388" containerName="mariadb-database-create" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461334 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="773bad92-580e-4a9c-9ba5-eef9d8bbc40d" containerName="glance-log" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461346 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c861107-6a1d-49f7-bc63-b95008ee5ddc" containerName="mariadb-database-create" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461355 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a4ef124-b4dd-43df-bdfb-97c65685977c" 
containerName="mariadb-account-create-update" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461362 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="773bad92-580e-4a9c-9ba5-eef9d8bbc40d" containerName="glance-httpd" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461399 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="536b2b85-21d0-47ba-8825-998dcb7b0058" containerName="mariadb-account-create-update" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461409 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="00f4f78b-ccfb-4413-9a81-d5b461a5e319" containerName="mariadb-account-create-update" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461428 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c0a379-8f0b-4414-863c-eaed0745ce2d" containerName="mariadb-database-create" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.462586 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.470669 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.470820 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.470984 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.644824 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.644904 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.644959 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.645051 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.645075 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.645295 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw5hn\" (UniqueName: \"kubernetes.io/projected/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-kube-api-access-vw5hn\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.645459 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-logs\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.645617 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.671636 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="773bad92-580e-4a9c-9ba5-eef9d8bbc40d" path="/var/lib/kubelet/pods/773bad92-580e-4a9c-9ba5-eef9d8bbc40d/volumes" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.687944 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.747850 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.748249 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.748668 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.748838 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.749123 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.749155 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.749307 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw5hn\" (UniqueName: \"kubernetes.io/projected/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-kube-api-access-vw5hn\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.749501 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-logs\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.750108 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.750611 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.750951 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-logs\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.756272 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.758523 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.775312 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw5hn\" (UniqueName: \"kubernetes.io/projected/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-kube-api-access-vw5hn\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.775664 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.776638 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.792078 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.852284 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-scripts\") pod \"dc306890-4355-4f40-abc0-11753b34d120\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.852418 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-run-httpd\") pod \"dc306890-4355-4f40-abc0-11753b34d120\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.852511 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xntq9\" (UniqueName: \"kubernetes.io/projected/dc306890-4355-4f40-abc0-11753b34d120-kube-api-access-xntq9\") pod \"dc306890-4355-4f40-abc0-11753b34d120\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.852539 4790 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-log-httpd\") pod \"dc306890-4355-4f40-abc0-11753b34d120\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.852607 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-config-data\") pod \"dc306890-4355-4f40-abc0-11753b34d120\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.852669 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-combined-ca-bundle\") pod \"dc306890-4355-4f40-abc0-11753b34d120\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.852692 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-sg-core-conf-yaml\") pod \"dc306890-4355-4f40-abc0-11753b34d120\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.852845 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dc306890-4355-4f40-abc0-11753b34d120" (UID: "dc306890-4355-4f40-abc0-11753b34d120"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.853152 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.853434 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dc306890-4355-4f40-abc0-11753b34d120" (UID: "dc306890-4355-4f40-abc0-11753b34d120"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.856169 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.856978 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-scripts" (OuterVolumeSpecName: "scripts") pod "dc306890-4355-4f40-abc0-11753b34d120" (UID: "dc306890-4355-4f40-abc0-11753b34d120"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.860583 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc306890-4355-4f40-abc0-11753b34d120-kube-api-access-xntq9" (OuterVolumeSpecName: "kube-api-access-xntq9") pod "dc306890-4355-4f40-abc0-11753b34d120" (UID: "dc306890-4355-4f40-abc0-11753b34d120"). InnerVolumeSpecName "kube-api-access-xntq9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.900478 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dc306890-4355-4f40-abc0-11753b34d120" (UID: "dc306890-4355-4f40-abc0-11753b34d120"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.950274 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc306890-4355-4f40-abc0-11753b34d120" (UID: "dc306890-4355-4f40-abc0-11753b34d120"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.956099 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.956128 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xntq9\" (UniqueName: \"kubernetes.io/projected/dc306890-4355-4f40-abc0-11753b34d120-kube-api-access-xntq9\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.956138 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.956148 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.956160 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.009643 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-config-data" (OuterVolumeSpecName: "config-data") pod "dc306890-4355-4f40-abc0-11753b34d120" (UID: "dc306890-4355-4f40-abc0-11753b34d120"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.058363 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.324936 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc306890-4355-4f40-abc0-11753b34d120","Type":"ContainerDied","Data":"024f2c04fd4b7dc120d8e7fd5885a7dd4f3c3552f5f6d2b723fe33619d522ce0"} Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.324961 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.325003 4790 scope.go:117] "RemoveContainer" containerID="dd19d1f47a779bc0eefe03ea425f43911b8fa1cade11d838fb762ff00ee08c99" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.347011 4790 scope.go:117] "RemoveContainer" containerID="c12845e4c31624900dc62dbd98c8791cea0c6f646b09e9fa4c0931ec955bfc38" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.367465 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.378097 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.381514 4790 scope.go:117] "RemoveContainer" containerID="427f4e121de836625292dc58d9f628e241b940ecfee11bfb04fb92802c2bd9a9" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.388707 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:04 crc kubenswrapper[4790]: E0313 20:49:04.389148 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="ceilometer-central-agent" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.389167 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="ceilometer-central-agent" Mar 13 20:49:04 crc kubenswrapper[4790]: E0313 20:49:04.389183 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="ceilometer-notification-agent" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.389189 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="ceilometer-notification-agent" Mar 13 20:49:04 crc kubenswrapper[4790]: E0313 20:49:04.389207 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="proxy-httpd" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.389214 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="proxy-httpd" Mar 13 20:49:04 crc kubenswrapper[4790]: E0313 20:49:04.389239 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="sg-core" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.389245 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="sg-core" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.389414 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="ceilometer-notification-agent" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.389423 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="ceilometer-central-agent" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.389440 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="sg-core" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.389451 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="proxy-httpd" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.390993 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.395494 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.395897 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.404790 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.409588 4790 scope.go:117] "RemoveContainer" containerID="e77c2f06981ff16ce8a83ea4cf86ff45903e943a0fce3443c6ecd2493b205d22" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.438954 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:49:04 crc kubenswrapper[4790]: W0313 20:49:04.441755 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5b10e44_e0ce_4568_b33c_dd9855d61fd7.slice/crio-e040dc473abb82760a60af4057ceac82d6175c820d5c14843f969c831eb47017 WatchSource:0}: Error finding container e040dc473abb82760a60af4057ceac82d6175c820d5c14843f969c831eb47017: Status 404 returned error can't find the container with id e040dc473abb82760a60af4057ceac82d6175c820d5c14843f969c831eb47017 Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.568979 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-config-data\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.569101 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.569201 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-scripts\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.569328 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-run-httpd\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.569362 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-log-httpd\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.569626 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfs8j\" (UniqueName: \"kubernetes.io/projected/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-kube-api-access-nfs8j\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.569787 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.671330 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-run-httpd\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.671407 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-log-httpd\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.671547 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfs8j\" (UniqueName: \"kubernetes.io/projected/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-kube-api-access-nfs8j\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.671616 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.671670 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-config-data\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.671701 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.671733 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-scripts\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.671873 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-run-httpd\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.673858 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-log-httpd\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.677787 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-scripts\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.679515 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.680249 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.703063 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-config-data\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.703501 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfs8j\" (UniqueName: \"kubernetes.io/projected/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-kube-api-access-nfs8j\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.731574 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.210418 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.364884 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2f1d856-14cc-48bb-b155-e74f8e5a9b56","Type":"ContainerStarted","Data":"78bf65debeefb94f3999e2e736301029c0c817dd3ba45f159bad72d2cdf7dd64"} Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.367723 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b5b10e44-e0ce-4568-b33c-dd9855d61fd7","Type":"ContainerStarted","Data":"54ab8d08cdb0518dcdc67eeefbbb5b95f198e57d4e571b3e396b5da4783891d6"} Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.367754 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b5b10e44-e0ce-4568-b33c-dd9855d61fd7","Type":"ContainerStarted","Data":"e040dc473abb82760a60af4057ceac82d6175c820d5c14843f969c831eb47017"} Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.369287 4790 generic.go:334] "Generic (PLEG): container finished" podID="88252e8c-21d9-402a-bab0-9f61b5eb3a70" containerID="963374fd67ec679caf00dd9bcc27806bbcfe92963bf34ed2f7df82c29a36025b" exitCode=0 Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.369313 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cd9b448d6-w8fcr" event={"ID":"88252e8c-21d9-402a-bab0-9f61b5eb3a70","Type":"ContainerDied","Data":"963374fd67ec679caf00dd9bcc27806bbcfe92963bf34ed2f7df82c29a36025b"} Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.562789 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.670642 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc306890-4355-4f40-abc0-11753b34d120" path="/var/lib/kubelet/pods/dc306890-4355-4f40-abc0-11753b34d120/volumes" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.700040 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-scripts\") pod \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.700131 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-public-tls-certs\") pod \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.700152 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-config-data\") pod \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.700199 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-combined-ca-bundle\") pod \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.700286 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88252e8c-21d9-402a-bab0-9f61b5eb3a70-logs\") pod \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\" 
(UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.700317 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-internal-tls-certs\") pod \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.700397 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl6nj\" (UniqueName: \"kubernetes.io/projected/88252e8c-21d9-402a-bab0-9f61b5eb3a70-kube-api-access-xl6nj\") pod \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.700970 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88252e8c-21d9-402a-bab0-9f61b5eb3a70-logs" (OuterVolumeSpecName: "logs") pod "88252e8c-21d9-402a-bab0-9f61b5eb3a70" (UID: "88252e8c-21d9-402a-bab0-9f61b5eb3a70"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.706397 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-scripts" (OuterVolumeSpecName: "scripts") pod "88252e8c-21d9-402a-bab0-9f61b5eb3a70" (UID: "88252e8c-21d9-402a-bab0-9f61b5eb3a70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.719569 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88252e8c-21d9-402a-bab0-9f61b5eb3a70-kube-api-access-xl6nj" (OuterVolumeSpecName: "kube-api-access-xl6nj") pod "88252e8c-21d9-402a-bab0-9f61b5eb3a70" (UID: "88252e8c-21d9-402a-bab0-9f61b5eb3a70"). 
InnerVolumeSpecName "kube-api-access-xl6nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.794640 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88252e8c-21d9-402a-bab0-9f61b5eb3a70" (UID: "88252e8c-21d9-402a-bab0-9f61b5eb3a70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.798471 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-config-data" (OuterVolumeSpecName: "config-data") pod "88252e8c-21d9-402a-bab0-9f61b5eb3a70" (UID: "88252e8c-21d9-402a-bab0-9f61b5eb3a70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.802312 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.802351 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88252e8c-21d9-402a-bab0-9f61b5eb3a70-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.802367 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl6nj\" (UniqueName: \"kubernetes.io/projected/88252e8c-21d9-402a-bab0-9f61b5eb3a70-kube-api-access-xl6nj\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.802399 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-scripts\") on 
node \"crc\" DevicePath \"\"" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.802411 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.819590 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "88252e8c-21d9-402a-bab0-9f61b5eb3a70" (UID: "88252e8c-21d9-402a-bab0-9f61b5eb3a70"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.836577 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "88252e8c-21d9-402a-bab0-9f61b5eb3a70" (UID: "88252e8c-21d9-402a-bab0-9f61b5eb3a70"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.903788 4790 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.903829 4790 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:06 crc kubenswrapper[4790]: I0313 20:49:06.083719 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:06 crc kubenswrapper[4790]: I0313 20:49:06.389823 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cd9b448d6-w8fcr" event={"ID":"88252e8c-21d9-402a-bab0-9f61b5eb3a70","Type":"ContainerDied","Data":"5d216af4785a04f3e8536b6945d51a46024ad4cfced21083156e56a883fa3cab"} Mar 13 20:49:06 crc kubenswrapper[4790]: I0313 20:49:06.389856 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:49:06 crc kubenswrapper[4790]: I0313 20:49:06.391025 4790 scope.go:117] "RemoveContainer" containerID="963374fd67ec679caf00dd9bcc27806bbcfe92963bf34ed2f7df82c29a36025b" Mar 13 20:49:06 crc kubenswrapper[4790]: I0313 20:49:06.400971 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2f1d856-14cc-48bb-b155-e74f8e5a9b56","Type":"ContainerStarted","Data":"926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df"} Mar 13 20:49:06 crc kubenswrapper[4790]: I0313 20:49:06.404195 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b5b10e44-e0ce-4568-b33c-dd9855d61fd7","Type":"ContainerStarted","Data":"9a171e4fd7775bef92c105269dc6240f03221a50b1282faf0071e3ee05776514"} Mar 13 20:49:06 crc kubenswrapper[4790]: I0313 20:49:06.422148 4790 scope.go:117] "RemoveContainer" containerID="f4b58b71174400c77e39715f7e4970a3816e119db2203fb9220a857f485f79bd" Mar 13 20:49:06 crc kubenswrapper[4790]: I0313 20:49:06.438349 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.438326703 podStartE2EDuration="3.438326703s" podCreationTimestamp="2026-03-13 20:49:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:49:06.430347305 +0000 UTC m=+1277.451463206" watchObservedRunningTime="2026-03-13 20:49:06.438326703 +0000 UTC m=+1277.459442594" Mar 13 20:49:06 crc kubenswrapper[4790]: I0313 20:49:06.462775 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6cd9b448d6-w8fcr"] Mar 13 20:49:06 crc kubenswrapper[4790]: I0313 20:49:06.472398 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6cd9b448d6-w8fcr"] Mar 13 20:49:07 crc kubenswrapper[4790]: I0313 20:49:07.416469 
4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2f1d856-14cc-48bb-b155-e74f8e5a9b56","Type":"ContainerStarted","Data":"b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3"} Mar 13 20:49:07 crc kubenswrapper[4790]: I0313 20:49:07.672299 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88252e8c-21d9-402a-bab0-9f61b5eb3a70" path="/var/lib/kubelet/pods/88252e8c-21d9-402a-bab0-9f61b5eb3a70/volumes" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.179021 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-82klj"] Mar 13 20:49:08 crc kubenswrapper[4790]: E0313 20:49:08.179776 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88252e8c-21d9-402a-bab0-9f61b5eb3a70" containerName="placement-api" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.179800 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="88252e8c-21d9-402a-bab0-9f61b5eb3a70" containerName="placement-api" Mar 13 20:49:08 crc kubenswrapper[4790]: E0313 20:49:08.179822 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88252e8c-21d9-402a-bab0-9f61b5eb3a70" containerName="placement-log" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.179830 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="88252e8c-21d9-402a-bab0-9f61b5eb3a70" containerName="placement-log" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.180071 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="88252e8c-21d9-402a-bab0-9f61b5eb3a70" containerName="placement-api" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.180094 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="88252e8c-21d9-402a-bab0-9f61b5eb3a70" containerName="placement-log" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.180853 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.184049 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2wmdt" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.184268 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.184439 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.201347 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-82klj"] Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.250470 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-scripts\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.250595 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-config-data\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.250658 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " 
pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.250693 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hlv7\" (UniqueName: \"kubernetes.io/projected/04b866fe-5d7d-46ab-9074-b93ddc7724f0-kube-api-access-4hlv7\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.352538 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-config-data\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.353112 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.353142 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hlv7\" (UniqueName: \"kubernetes.io/projected/04b866fe-5d7d-46ab-9074-b93ddc7724f0-kube-api-access-4hlv7\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.353215 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-scripts\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: 
\"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.359146 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-scripts\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.360069 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-config-data\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.371203 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.376126 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hlv7\" (UniqueName: \"kubernetes.io/projected/04b866fe-5d7d-46ab-9074-b93ddc7724f0-kube-api-access-4hlv7\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.428522 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2f1d856-14cc-48bb-b155-e74f8e5a9b56","Type":"ContainerStarted","Data":"786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2"} Mar 13 20:49:08 crc kubenswrapper[4790]: 
I0313 20:49:08.498210 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.972089 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-82klj"] Mar 13 20:49:08 crc kubenswrapper[4790]: W0313 20:49:08.976396 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04b866fe_5d7d_46ab_9074_b93ddc7724f0.slice/crio-b43dc93316c7d4a150902a2bc087f9dbd70b6a0fb345d27db2100e27d21c97ff WatchSource:0}: Error finding container b43dc93316c7d4a150902a2bc087f9dbd70b6a0fb345d27db2100e27d21c97ff: Status 404 returned error can't find the container with id b43dc93316c7d4a150902a2bc087f9dbd70b6a0fb345d27db2100e27d21c97ff Mar 13 20:49:09 crc kubenswrapper[4790]: I0313 20:49:09.442666 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-82klj" event={"ID":"04b866fe-5d7d-46ab-9074-b93ddc7724f0","Type":"ContainerStarted","Data":"b43dc93316c7d4a150902a2bc087f9dbd70b6a0fb345d27db2100e27d21c97ff"} Mar 13 20:49:09 crc kubenswrapper[4790]: I0313 20:49:09.930757 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 20:49:09 crc kubenswrapper[4790]: I0313 20:49:09.931108 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 20:49:10 crc kubenswrapper[4790]: I0313 20:49:10.000784 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 20:49:10 crc kubenswrapper[4790]: I0313 20:49:10.016249 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 20:49:10 crc kubenswrapper[4790]: I0313 20:49:10.453647 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2f1d856-14cc-48bb-b155-e74f8e5a9b56","Type":"ContainerStarted","Data":"327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03"} Mar 13 20:49:10 crc kubenswrapper[4790]: I0313 20:49:10.453944 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 20:49:10 crc kubenswrapper[4790]: I0313 20:49:10.453968 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 20:49:10 crc kubenswrapper[4790]: I0313 20:49:10.453945 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="ceilometer-central-agent" containerID="cri-o://926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df" gracePeriod=30 Mar 13 20:49:10 crc kubenswrapper[4790]: I0313 20:49:10.454014 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="sg-core" containerID="cri-o://786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2" gracePeriod=30 Mar 13 20:49:10 crc kubenswrapper[4790]: I0313 20:49:10.454041 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="ceilometer-notification-agent" containerID="cri-o://b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3" gracePeriod=30 Mar 13 20:49:10 crc kubenswrapper[4790]: I0313 20:49:10.454099 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="proxy-httpd" containerID="cri-o://327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03" gracePeriod=30 Mar 13 
20:49:10 crc kubenswrapper[4790]: I0313 20:49:10.482340 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.101172259 podStartE2EDuration="6.482319522s" podCreationTimestamp="2026-03-13 20:49:04 +0000 UTC" firstStartedPulling="2026-03-13 20:49:05.247147005 +0000 UTC m=+1276.268262896" lastFinishedPulling="2026-03-13 20:49:09.628294268 +0000 UTC m=+1280.649410159" observedRunningTime="2026-03-13 20:49:10.477698175 +0000 UTC m=+1281.498814096" watchObservedRunningTime="2026-03-13 20:49:10.482319522 +0000 UTC m=+1281.503435413" Mar 13 20:49:11 crc kubenswrapper[4790]: I0313 20:49:11.475550 4790 generic.go:334] "Generic (PLEG): container finished" podID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerID="327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03" exitCode=0 Mar 13 20:49:11 crc kubenswrapper[4790]: I0313 20:49:11.475842 4790 generic.go:334] "Generic (PLEG): container finished" podID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerID="786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2" exitCode=2 Mar 13 20:49:11 crc kubenswrapper[4790]: I0313 20:49:11.475851 4790 generic.go:334] "Generic (PLEG): container finished" podID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerID="b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3" exitCode=0 Mar 13 20:49:11 crc kubenswrapper[4790]: I0313 20:49:11.475625 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2f1d856-14cc-48bb-b155-e74f8e5a9b56","Type":"ContainerDied","Data":"327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03"} Mar 13 20:49:11 crc kubenswrapper[4790]: I0313 20:49:11.475948 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2f1d856-14cc-48bb-b155-e74f8e5a9b56","Type":"ContainerDied","Data":"786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2"} Mar 13 20:49:11 crc 
kubenswrapper[4790]: I0313 20:49:11.475960 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2f1d856-14cc-48bb-b155-e74f8e5a9b56","Type":"ContainerDied","Data":"b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3"} Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.027349 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.132427 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-sg-core-conf-yaml\") pod \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.132493 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-scripts\") pod \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.132559 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-combined-ca-bundle\") pod \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.132586 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-log-httpd\") pod \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.132649 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-nfs8j\" (UniqueName: \"kubernetes.io/projected/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-kube-api-access-nfs8j\") pod \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.132673 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-run-httpd\") pod \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.132716 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-config-data\") pod \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.134437 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e2f1d856-14cc-48bb-b155-e74f8e5a9b56" (UID: "e2f1d856-14cc-48bb-b155-e74f8e5a9b56"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.134787 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e2f1d856-14cc-48bb-b155-e74f8e5a9b56" (UID: "e2f1d856-14cc-48bb-b155-e74f8e5a9b56"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.138983 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-scripts" (OuterVolumeSpecName: "scripts") pod "e2f1d856-14cc-48bb-b155-e74f8e5a9b56" (UID: "e2f1d856-14cc-48bb-b155-e74f8e5a9b56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.139218 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-kube-api-access-nfs8j" (OuterVolumeSpecName: "kube-api-access-nfs8j") pod "e2f1d856-14cc-48bb-b155-e74f8e5a9b56" (UID: "e2f1d856-14cc-48bb-b155-e74f8e5a9b56"). InnerVolumeSpecName "kube-api-access-nfs8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.167716 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e2f1d856-14cc-48bb-b155-e74f8e5a9b56" (UID: "e2f1d856-14cc-48bb-b155-e74f8e5a9b56"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.216304 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2f1d856-14cc-48bb-b155-e74f8e5a9b56" (UID: "e2f1d856-14cc-48bb-b155-e74f8e5a9b56"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.234645 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfs8j\" (UniqueName: \"kubernetes.io/projected/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-kube-api-access-nfs8j\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.234689 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.234700 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.234712 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.234722 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.234733 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.239054 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-config-data" (OuterVolumeSpecName: "config-data") pod "e2f1d856-14cc-48bb-b155-e74f8e5a9b56" (UID: "e2f1d856-14cc-48bb-b155-e74f8e5a9b56"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.335712 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.489453 4790 generic.go:334] "Generic (PLEG): container finished" podID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerID="926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df" exitCode=0 Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.489504 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2f1d856-14cc-48bb-b155-e74f8e5a9b56","Type":"ContainerDied","Data":"926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df"} Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.489537 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2f1d856-14cc-48bb-b155-e74f8e5a9b56","Type":"ContainerDied","Data":"78bf65debeefb94f3999e2e736301029c0c817dd3ba45f159bad72d2cdf7dd64"} Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.489558 4790 scope.go:117] "RemoveContainer" containerID="327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.489713 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.528579 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.549578 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.566772 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:12 crc kubenswrapper[4790]: E0313 20:49:12.567132 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="proxy-httpd" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.567144 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="proxy-httpd" Mar 13 20:49:12 crc kubenswrapper[4790]: E0313 20:49:12.567166 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="ceilometer-notification-agent" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.567172 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="ceilometer-notification-agent" Mar 13 20:49:12 crc kubenswrapper[4790]: E0313 20:49:12.567181 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="ceilometer-central-agent" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.567187 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="ceilometer-central-agent" Mar 13 20:49:12 crc kubenswrapper[4790]: E0313 20:49:12.567205 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="sg-core" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.567211 4790 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="sg-core" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.567974 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="ceilometer-central-agent" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.567996 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="proxy-httpd" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.568008 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="sg-core" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.568030 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="ceilometer-notification-agent" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.569988 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.576290 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.577480 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.577508 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.577575 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.577854 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.617911 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.746477 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-config-data\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.746571 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ppp8\" (UniqueName: \"kubernetes.io/projected/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-kube-api-access-6ppp8\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.746610 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-log-httpd\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.746664 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.746705 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.746838 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-scripts\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.746862 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-run-httpd\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.848200 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " 
pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.848552 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.848725 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-scripts\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.848824 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-run-httpd\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.848926 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-config-data\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.849028 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ppp8\" (UniqueName: \"kubernetes.io/projected/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-kube-api-access-6ppp8\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.849133 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-log-httpd\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.849692 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-log-httpd\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.850581 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-run-httpd\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.854941 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.855466 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.856261 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-config-data\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.856827 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-scripts\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.871160 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ppp8\" (UniqueName: \"kubernetes.io/projected/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-kube-api-access-6ppp8\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.897646 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:13 crc kubenswrapper[4790]: I0313 20:49:13.672683 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" path="/var/lib/kubelet/pods/e2f1d856-14cc-48bb-b155-e74f8e5a9b56/volumes" Mar 13 20:49:13 crc kubenswrapper[4790]: I0313 20:49:13.856693 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:13 crc kubenswrapper[4790]: I0313 20:49:13.856768 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:13 crc kubenswrapper[4790]: I0313 20:49:13.897511 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:13 crc kubenswrapper[4790]: I0313 20:49:13.908006 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:14 crc kubenswrapper[4790]: I0313 20:49:14.523426 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:14 crc kubenswrapper[4790]: I0313 20:49:14.523785 4790 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:14 crc kubenswrapper[4790]: I0313 20:49:14.992325 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:16 crc kubenswrapper[4790]: I0313 20:49:16.659222 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:16 crc kubenswrapper[4790]: I0313 20:49:16.659661 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 20:49:16 crc kubenswrapper[4790]: I0313 20:49:16.674918 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.385083 4790 scope.go:117] "RemoveContainer" containerID="786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.460985 4790 scope.go:117] "RemoveContainer" containerID="b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.599607 4790 scope.go:117] "RemoveContainer" containerID="926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.636539 4790 scope.go:117] "RemoveContainer" containerID="327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03" Mar 13 20:49:18 crc kubenswrapper[4790]: E0313 20:49:18.637039 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03\": container with ID starting with 327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03 not found: ID does not exist" containerID="327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 
20:49:18.637097 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03"} err="failed to get container status \"327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03\": rpc error: code = NotFound desc = could not find container \"327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03\": container with ID starting with 327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03 not found: ID does not exist" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.637129 4790 scope.go:117] "RemoveContainer" containerID="786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2" Mar 13 20:49:18 crc kubenswrapper[4790]: E0313 20:49:18.637503 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2\": container with ID starting with 786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2 not found: ID does not exist" containerID="786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.637525 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2"} err="failed to get container status \"786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2\": rpc error: code = NotFound desc = could not find container \"786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2\": container with ID starting with 786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2 not found: ID does not exist" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.637537 4790 scope.go:117] "RemoveContainer" containerID="b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3" Mar 13 20:49:18 crc 
kubenswrapper[4790]: E0313 20:49:18.637983 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3\": container with ID starting with b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3 not found: ID does not exist" containerID="b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.638000 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3"} err="failed to get container status \"b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3\": rpc error: code = NotFound desc = could not find container \"b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3\": container with ID starting with b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3 not found: ID does not exist" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.638012 4790 scope.go:117] "RemoveContainer" containerID="926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df" Mar 13 20:49:18 crc kubenswrapper[4790]: E0313 20:49:18.647726 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df\": container with ID starting with 926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df not found: ID does not exist" containerID="926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.647760 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df"} err="failed to get container status 
\"926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df\": rpc error: code = NotFound desc = could not find container \"926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df\": container with ID starting with 926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df not found: ID does not exist" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.941984 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:22 crc kubenswrapper[4790]: I0313 20:49:22.629269 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0bf0d8a-16c3-4a20-905d-f08a5906ded3","Type":"ContainerStarted","Data":"1d97979fdd68d0748ba8fa4d7f33307ed19c474507126a6137c97c42d6089130"} Mar 13 20:49:23 crc kubenswrapper[4790]: I0313 20:49:23.643850 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-82klj" event={"ID":"04b866fe-5d7d-46ab-9074-b93ddc7724f0","Type":"ContainerStarted","Data":"6d9662cc81f66265ce8ecfaf149044a45f9586bc1e7f991bca5d3650ff0fd63f"} Mar 13 20:49:23 crc kubenswrapper[4790]: I0313 20:49:23.680751 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-82klj" podStartSLOduration=2.100135928 podStartE2EDuration="15.680726442s" podCreationTimestamp="2026-03-13 20:49:08 +0000 UTC" firstStartedPulling="2026-03-13 20:49:08.978883466 +0000 UTC m=+1279.999999357" lastFinishedPulling="2026-03-13 20:49:22.55947398 +0000 UTC m=+1293.580589871" observedRunningTime="2026-03-13 20:49:23.669094435 +0000 UTC m=+1294.690210326" watchObservedRunningTime="2026-03-13 20:49:23.680726442 +0000 UTC m=+1294.701842333" Mar 13 20:49:23 crc kubenswrapper[4790]: I0313 20:49:23.693057 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e0bf0d8a-16c3-4a20-905d-f08a5906ded3","Type":"ContainerStarted","Data":"2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289"} Mar 13 20:49:24 crc kubenswrapper[4790]: I0313 20:49:24.682280 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0bf0d8a-16c3-4a20-905d-f08a5906ded3","Type":"ContainerStarted","Data":"909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280"} Mar 13 20:49:24 crc kubenswrapper[4790]: I0313 20:49:24.682890 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0bf0d8a-16c3-4a20-905d-f08a5906ded3","Type":"ContainerStarted","Data":"407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1"} Mar 13 20:49:26 crc kubenswrapper[4790]: I0313 20:49:26.703311 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0bf0d8a-16c3-4a20-905d-f08a5906ded3","Type":"ContainerStarted","Data":"2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c"} Mar 13 20:49:26 crc kubenswrapper[4790]: I0313 20:49:26.704613 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 20:49:26 crc kubenswrapper[4790]: I0313 20:49:26.703463 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="sg-core" containerID="cri-o://909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280" gracePeriod=30 Mar 13 20:49:26 crc kubenswrapper[4790]: I0313 20:49:26.703455 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="ceilometer-central-agent" containerID="cri-o://2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289" gracePeriod=30 Mar 13 20:49:26 crc kubenswrapper[4790]: I0313 20:49:26.703483 4790 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="ceilometer-notification-agent" containerID="cri-o://407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1" gracePeriod=30 Mar 13 20:49:26 crc kubenswrapper[4790]: I0313 20:49:26.703455 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="proxy-httpd" containerID="cri-o://2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c" gracePeriod=30 Mar 13 20:49:26 crc kubenswrapper[4790]: I0313 20:49:26.744322 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=11.213326324 podStartE2EDuration="14.744305009s" podCreationTimestamp="2026-03-13 20:49:12 +0000 UTC" firstStartedPulling="2026-03-13 20:49:22.545593292 +0000 UTC m=+1293.566709183" lastFinishedPulling="2026-03-13 20:49:26.076571977 +0000 UTC m=+1297.097687868" observedRunningTime="2026-03-13 20:49:26.737509653 +0000 UTC m=+1297.758625554" watchObservedRunningTime="2026-03-13 20:49:26.744305009 +0000 UTC m=+1297.765420900" Mar 13 20:49:27 crc kubenswrapper[4790]: I0313 20:49:27.718088 4790 generic.go:334] "Generic (PLEG): container finished" podID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerID="2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c" exitCode=0 Mar 13 20:49:27 crc kubenswrapper[4790]: I0313 20:49:27.718441 4790 generic.go:334] "Generic (PLEG): container finished" podID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerID="909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280" exitCode=2 Mar 13 20:49:27 crc kubenswrapper[4790]: I0313 20:49:27.718453 4790 generic.go:334] "Generic (PLEG): container finished" podID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" 
containerID="407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1" exitCode=0 Mar 13 20:49:27 crc kubenswrapper[4790]: I0313 20:49:27.718160 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0bf0d8a-16c3-4a20-905d-f08a5906ded3","Type":"ContainerDied","Data":"2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c"} Mar 13 20:49:27 crc kubenswrapper[4790]: I0313 20:49:27.718492 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0bf0d8a-16c3-4a20-905d-f08a5906ded3","Type":"ContainerDied","Data":"909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280"} Mar 13 20:49:27 crc kubenswrapper[4790]: I0313 20:49:27.718509 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0bf0d8a-16c3-4a20-905d-f08a5906ded3","Type":"ContainerDied","Data":"407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1"} Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.684596 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.768637 4790 generic.go:334] "Generic (PLEG): container finished" podID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerID="2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289" exitCode=0 Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.768682 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0bf0d8a-16c3-4a20-905d-f08a5906ded3","Type":"ContainerDied","Data":"2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289"} Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.768705 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0bf0d8a-16c3-4a20-905d-f08a5906ded3","Type":"ContainerDied","Data":"1d97979fdd68d0748ba8fa4d7f33307ed19c474507126a6137c97c42d6089130"} Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.768722 4790 scope.go:117] "RemoveContainer" containerID="2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.768834 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.789285 4790 scope.go:117] "RemoveContainer" containerID="909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.812867 4790 scope.go:117] "RemoveContainer" containerID="407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.835235 4790 scope.go:117] "RemoveContainer" containerID="2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.843579 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-config-data\") pod \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.843715 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-scripts\") pod \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.843744 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-sg-core-conf-yaml\") pod \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.843796 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-combined-ca-bundle\") pod \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " Mar 13 
20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.843874 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-log-httpd\") pod \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.843926 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-run-httpd\") pod \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.843952 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ppp8\" (UniqueName: \"kubernetes.io/projected/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-kube-api-access-6ppp8\") pod \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.844326 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e0bf0d8a-16c3-4a20-905d-f08a5906ded3" (UID: "e0bf0d8a-16c3-4a20-905d-f08a5906ded3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.844439 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e0bf0d8a-16c3-4a20-905d-f08a5906ded3" (UID: "e0bf0d8a-16c3-4a20-905d-f08a5906ded3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.849867 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-scripts" (OuterVolumeSpecName: "scripts") pod "e0bf0d8a-16c3-4a20-905d-f08a5906ded3" (UID: "e0bf0d8a-16c3-4a20-905d-f08a5906ded3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.851051 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-kube-api-access-6ppp8" (OuterVolumeSpecName: "kube-api-access-6ppp8") pod "e0bf0d8a-16c3-4a20-905d-f08a5906ded3" (UID: "e0bf0d8a-16c3-4a20-905d-f08a5906ded3"). InnerVolumeSpecName "kube-api-access-6ppp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.858853 4790 scope.go:117] "RemoveContainer" containerID="2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c" Mar 13 20:49:31 crc kubenswrapper[4790]: E0313 20:49:31.859751 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c\": container with ID starting with 2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c not found: ID does not exist" containerID="2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.859811 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c"} err="failed to get container status \"2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c\": rpc error: code = NotFound desc = could not find container 
\"2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c\": container with ID starting with 2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c not found: ID does not exist" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.859845 4790 scope.go:117] "RemoveContainer" containerID="909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280" Mar 13 20:49:31 crc kubenswrapper[4790]: E0313 20:49:31.860327 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280\": container with ID starting with 909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280 not found: ID does not exist" containerID="909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.860409 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280"} err="failed to get container status \"909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280\": rpc error: code = NotFound desc = could not find container \"909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280\": container with ID starting with 909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280 not found: ID does not exist" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.860449 4790 scope.go:117] "RemoveContainer" containerID="407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1" Mar 13 20:49:31 crc kubenswrapper[4790]: E0313 20:49:31.860827 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1\": container with ID starting with 407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1 not found: ID does not exist" 
containerID="407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.860861 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1"} err="failed to get container status \"407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1\": rpc error: code = NotFound desc = could not find container \"407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1\": container with ID starting with 407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1 not found: ID does not exist" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.860881 4790 scope.go:117] "RemoveContainer" containerID="2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289" Mar 13 20:49:31 crc kubenswrapper[4790]: E0313 20:49:31.861185 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289\": container with ID starting with 2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289 not found: ID does not exist" containerID="2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.861215 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289"} err="failed to get container status \"2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289\": rpc error: code = NotFound desc = could not find container \"2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289\": container with ID starting with 2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289 not found: ID does not exist" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.886737 4790 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e0bf0d8a-16c3-4a20-905d-f08a5906ded3" (UID: "e0bf0d8a-16c3-4a20-905d-f08a5906ded3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.918702 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0bf0d8a-16c3-4a20-905d-f08a5906ded3" (UID: "e0bf0d8a-16c3-4a20-905d-f08a5906ded3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.937474 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-config-data" (OuterVolumeSpecName: "config-data") pod "e0bf0d8a-16c3-4a20-905d-f08a5906ded3" (UID: "e0bf0d8a-16c3-4a20-905d-f08a5906ded3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.945535 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.945578 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.945587 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.945596 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.945605 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.945612 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.945621 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ppp8\" (UniqueName: \"kubernetes.io/projected/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-kube-api-access-6ppp8\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.100820 4790 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.109610 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.125138 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:32 crc kubenswrapper[4790]: E0313 20:49:32.125534 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="ceilometer-notification-agent" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.125558 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="ceilometer-notification-agent" Mar 13 20:49:32 crc kubenswrapper[4790]: E0313 20:49:32.125670 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="sg-core" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.125684 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="sg-core" Mar 13 20:49:32 crc kubenswrapper[4790]: E0313 20:49:32.125702 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="ceilometer-central-agent" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.125709 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="ceilometer-central-agent" Mar 13 20:49:32 crc kubenswrapper[4790]: E0313 20:49:32.125726 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="proxy-httpd" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.125733 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="proxy-httpd" Mar 13 20:49:32 crc kubenswrapper[4790]: 
I0313 20:49:32.125885 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="sg-core" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.125899 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="proxy-httpd" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.125919 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="ceilometer-notification-agent" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.125932 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="ceilometer-central-agent" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.128145 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.131010 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.131018 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.137747 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.250679 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-config-data\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.251192 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-scripts\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.251329 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.251478 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.251622 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-run-httpd\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.251719 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44qth\" (UniqueName: \"kubernetes.io/projected/31f1d628-34fa-4e75-8aa8-f3e724839ee8-kube-api-access-44qth\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.251907 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-log-httpd\") pod \"ceilometer-0\" (UID: 
\"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.353858 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-run-httpd\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.353908 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44qth\" (UniqueName: \"kubernetes.io/projected/31f1d628-34fa-4e75-8aa8-f3e724839ee8-kube-api-access-44qth\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.354001 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-log-httpd\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.354071 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-config-data\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.354087 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-scripts\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.354118 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.354136 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.354485 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-run-httpd\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.355195 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-log-httpd\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.359187 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.359583 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-config-data\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.366442 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.367593 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-scripts\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.372359 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44qth\" (UniqueName: \"kubernetes.io/projected/31f1d628-34fa-4e75-8aa8-f3e724839ee8-kube-api-access-44qth\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.450824 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.896251 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:32 crc kubenswrapper[4790]: W0313 20:49:32.897174 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1d628_34fa_4e75_8aa8_f3e724839ee8.slice/crio-d9418e81a4860ada9d69bb0521f12bde0b12309263c1efb3c1ac8e85db41aebe WatchSource:0}: Error finding container d9418e81a4860ada9d69bb0521f12bde0b12309263c1efb3c1ac8e85db41aebe: Status 404 returned error can't find the container with id d9418e81a4860ada9d69bb0521f12bde0b12309263c1efb3c1ac8e85db41aebe Mar 13 20:49:33 crc kubenswrapper[4790]: I0313 20:49:33.670148 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" path="/var/lib/kubelet/pods/e0bf0d8a-16c3-4a20-905d-f08a5906ded3/volumes" Mar 13 20:49:33 crc kubenswrapper[4790]: I0313 20:49:33.804877 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31f1d628-34fa-4e75-8aa8-f3e724839ee8","Type":"ContainerStarted","Data":"f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603"} Mar 13 20:49:33 crc kubenswrapper[4790]: I0313 20:49:33.804948 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31f1d628-34fa-4e75-8aa8-f3e724839ee8","Type":"ContainerStarted","Data":"d9418e81a4860ada9d69bb0521f12bde0b12309263c1efb3c1ac8e85db41aebe"} Mar 13 20:49:33 crc kubenswrapper[4790]: I0313 20:49:33.808657 4790 generic.go:334] "Generic (PLEG): container finished" podID="04b866fe-5d7d-46ab-9074-b93ddc7724f0" containerID="6d9662cc81f66265ce8ecfaf149044a45f9586bc1e7f991bca5d3650ff0fd63f" exitCode=0 Mar 13 20:49:33 crc kubenswrapper[4790]: I0313 20:49:33.808753 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-db-sync-82klj" event={"ID":"04b866fe-5d7d-46ab-9074-b93ddc7724f0","Type":"ContainerDied","Data":"6d9662cc81f66265ce8ecfaf149044a45f9586bc1e7f991bca5d3650ff0fd63f"} Mar 13 20:49:34 crc kubenswrapper[4790]: I0313 20:49:34.837266 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31f1d628-34fa-4e75-8aa8-f3e724839ee8","Type":"ContainerStarted","Data":"dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6"} Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.169401 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.309931 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-scripts\") pod \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.310032 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hlv7\" (UniqueName: \"kubernetes.io/projected/04b866fe-5d7d-46ab-9074-b93ddc7724f0-kube-api-access-4hlv7\") pod \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.310223 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-combined-ca-bundle\") pod \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.310292 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-config-data\") pod \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.317549 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-scripts" (OuterVolumeSpecName: "scripts") pod "04b866fe-5d7d-46ab-9074-b93ddc7724f0" (UID: "04b866fe-5d7d-46ab-9074-b93ddc7724f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.339090 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04b866fe-5d7d-46ab-9074-b93ddc7724f0-kube-api-access-4hlv7" (OuterVolumeSpecName: "kube-api-access-4hlv7") pod "04b866fe-5d7d-46ab-9074-b93ddc7724f0" (UID: "04b866fe-5d7d-46ab-9074-b93ddc7724f0"). InnerVolumeSpecName "kube-api-access-4hlv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.345749 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-config-data" (OuterVolumeSpecName: "config-data") pod "04b866fe-5d7d-46ab-9074-b93ddc7724f0" (UID: "04b866fe-5d7d-46ab-9074-b93ddc7724f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.349899 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04b866fe-5d7d-46ab-9074-b93ddc7724f0" (UID: "04b866fe-5d7d-46ab-9074-b93ddc7724f0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.412927 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hlv7\" (UniqueName: \"kubernetes.io/projected/04b866fe-5d7d-46ab-9074-b93ddc7724f0-kube-api-access-4hlv7\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.413243 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.413298 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.413314 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.849865 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.849857 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-82klj" event={"ID":"04b866fe-5d7d-46ab-9074-b93ddc7724f0","Type":"ContainerDied","Data":"b43dc93316c7d4a150902a2bc087f9dbd70b6a0fb345d27db2100e27d21c97ff"} Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.851091 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b43dc93316c7d4a150902a2bc087f9dbd70b6a0fb345d27db2100e27d21c97ff" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.853473 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31f1d628-34fa-4e75-8aa8-f3e724839ee8","Type":"ContainerStarted","Data":"8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922"} Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.933137 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 20:49:35 crc kubenswrapper[4790]: E0313 20:49:35.933797 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b866fe-5d7d-46ab-9074-b93ddc7724f0" containerName="nova-cell0-conductor-db-sync" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.933891 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b866fe-5d7d-46ab-9074-b93ddc7724f0" containerName="nova-cell0-conductor-db-sync" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.934133 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="04b866fe-5d7d-46ab-9074-b93ddc7724f0" containerName="nova-cell0-conductor-db-sync" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.934834 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.936928 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.938330 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2wmdt" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.946265 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.024297 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253ef3a1-1764-4120-a5f8-db908a0e7fd4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"253ef3a1-1764-4120-a5f8-db908a0e7fd4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.024410 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qf72\" (UniqueName: \"kubernetes.io/projected/253ef3a1-1764-4120-a5f8-db908a0e7fd4-kube-api-access-7qf72\") pod \"nova-cell0-conductor-0\" (UID: \"253ef3a1-1764-4120-a5f8-db908a0e7fd4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.024506 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253ef3a1-1764-4120-a5f8-db908a0e7fd4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"253ef3a1-1764-4120-a5f8-db908a0e7fd4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.126082 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/253ef3a1-1764-4120-a5f8-db908a0e7fd4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"253ef3a1-1764-4120-a5f8-db908a0e7fd4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.126166 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253ef3a1-1764-4120-a5f8-db908a0e7fd4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"253ef3a1-1764-4120-a5f8-db908a0e7fd4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.126218 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qf72\" (UniqueName: \"kubernetes.io/projected/253ef3a1-1764-4120-a5f8-db908a0e7fd4-kube-api-access-7qf72\") pod \"nova-cell0-conductor-0\" (UID: \"253ef3a1-1764-4120-a5f8-db908a0e7fd4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.162843 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253ef3a1-1764-4120-a5f8-db908a0e7fd4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"253ef3a1-1764-4120-a5f8-db908a0e7fd4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.162974 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253ef3a1-1764-4120-a5f8-db908a0e7fd4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"253ef3a1-1764-4120-a5f8-db908a0e7fd4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.166396 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qf72\" (UniqueName: \"kubernetes.io/projected/253ef3a1-1764-4120-a5f8-db908a0e7fd4-kube-api-access-7qf72\") pod \"nova-cell0-conductor-0\" 
(UID: \"253ef3a1-1764-4120-a5f8-db908a0e7fd4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.252791 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.732943 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 20:49:36 crc kubenswrapper[4790]: W0313 20:49:36.740257 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod253ef3a1_1764_4120_a5f8_db908a0e7fd4.slice/crio-c3e8cd3cca4cbe31f6adb96fd6c65d496456b81c4cd6e438339154cc6c12d22a WatchSource:0}: Error finding container c3e8cd3cca4cbe31f6adb96fd6c65d496456b81c4cd6e438339154cc6c12d22a: Status 404 returned error can't find the container with id c3e8cd3cca4cbe31f6adb96fd6c65d496456b81c4cd6e438339154cc6c12d22a Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.865194 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"253ef3a1-1764-4120-a5f8-db908a0e7fd4","Type":"ContainerStarted","Data":"c3e8cd3cca4cbe31f6adb96fd6c65d496456b81c4cd6e438339154cc6c12d22a"} Mar 13 20:49:37 crc kubenswrapper[4790]: I0313 20:49:37.877761 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31f1d628-34fa-4e75-8aa8-f3e724839ee8","Type":"ContainerStarted","Data":"986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46"} Mar 13 20:49:37 crc kubenswrapper[4790]: I0313 20:49:37.878639 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 20:49:37 crc kubenswrapper[4790]: I0313 20:49:37.882254 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"253ef3a1-1764-4120-a5f8-db908a0e7fd4","Type":"ContainerStarted","Data":"961aa80bee89b00f627b8513e50e0b372633913ae24bf7e58526880debf770ae"} Mar 13 20:49:37 crc kubenswrapper[4790]: I0313 20:49:37.882427 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:37 crc kubenswrapper[4790]: I0313 20:49:37.915677 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.226079191 podStartE2EDuration="5.915659282s" podCreationTimestamp="2026-03-13 20:49:32 +0000 UTC" firstStartedPulling="2026-03-13 20:49:32.899626073 +0000 UTC m=+1303.920741964" lastFinishedPulling="2026-03-13 20:49:36.589206164 +0000 UTC m=+1307.610322055" observedRunningTime="2026-03-13 20:49:37.903547432 +0000 UTC m=+1308.924663323" watchObservedRunningTime="2026-03-13 20:49:37.915659282 +0000 UTC m=+1308.936775173" Mar 13 20:49:37 crc kubenswrapper[4790]: I0313 20:49:37.926831 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.926810016 podStartE2EDuration="2.926810016s" podCreationTimestamp="2026-03-13 20:49:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:49:37.923034274 +0000 UTC m=+1308.944150165" watchObservedRunningTime="2026-03-13 20:49:37.926810016 +0000 UTC m=+1308.947925897" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.285914 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.747354 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-gj4j7"] Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.748895 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.751633 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.755685 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.757778 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gj4j7"] Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.836100 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-config-data\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.836190 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-scripts\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.836221 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.836243 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc945\" (UniqueName: 
\"kubernetes.io/projected/e71d98c3-e247-448e-945e-016a6755c689-kube-api-access-mc945\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.938311 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-config-data\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.938422 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-scripts\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.938452 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.938478 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc945\" (UniqueName: \"kubernetes.io/projected/e71d98c3-e247-448e-945e-016a6755c689-kube-api-access-mc945\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.947754 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-config-data\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.955010 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-scripts\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.957408 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.974729 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc945\" (UniqueName: \"kubernetes.io/projected/e71d98c3-e247-448e-945e-016a6755c689-kube-api-access-mc945\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.976415 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.977674 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.980218 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.994630 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.073253 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.073476 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.077025 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.080278 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.115263 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.164302 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-config-data\") pod \"nova-scheduler-0\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.164671 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85bct\" (UniqueName: \"kubernetes.io/projected/09a61a2b-7821-476f-af33-74837a0e2026-kube-api-access-85bct\") pod \"nova-scheduler-0\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:42 crc kubenswrapper[4790]: 
I0313 20:49:42.164719 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.262453 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.264984 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.272802 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.301006 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-config-data\") pod \"nova-scheduler-0\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.301110 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-config-data\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.301185 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6fnm\" (UniqueName: \"kubernetes.io/projected/24025591-dced-41d1-bd6d-e8784c0caa3b-kube-api-access-h6fnm\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 
20:49:42.301220 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24025591-dced-41d1-bd6d-e8784c0caa3b-logs\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.301252 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.301275 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85bct\" (UniqueName: \"kubernetes.io/projected/09a61a2b-7821-476f-af33-74837a0e2026-kube-api-access-85bct\") pod \"nova-scheduler-0\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.301300 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.301578 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.317498 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-jstn6"] Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.319063 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.326498 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.327760 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.329945 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.330180 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.343981 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-config-data\") pod \"nova-scheduler-0\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.349789 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-jstn6"] Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.355964 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.364224 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85bct\" (UniqueName: \"kubernetes.io/projected/09a61a2b-7821-476f-af33-74837a0e2026-kube-api-access-85bct\") pod \"nova-scheduler-0\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " pod="openstack/nova-scheduler-0" Mar 13 
20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.401764 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.402311 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2fd6d31-1072-47b0-aa6b-327fac52a13b-logs\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.402350 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6fnm\" (UniqueName: \"kubernetes.io/projected/24025591-dced-41d1-bd6d-e8784c0caa3b-kube-api-access-h6fnm\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.402394 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24025591-dced-41d1-bd6d-e8784c0caa3b-logs\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.402419 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.402441 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc 
kubenswrapper[4790]: I0313 20:49:42.402481 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-config-data\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.402499 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm7sm\" (UniqueName: \"kubernetes.io/projected/e2fd6d31-1072-47b0-aa6b-327fac52a13b-kube-api-access-gm7sm\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.402555 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-config-data\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.404268 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24025591-dced-41d1-bd6d-e8784c0caa3b-logs\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.406689 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-config-data\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.409406 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.431325 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6fnm\" (UniqueName: \"kubernetes.io/projected/24025591-dced-41d1-bd6d-e8784c0caa3b-kube-api-access-h6fnm\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504139 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-config-data\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504194 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm7sm\" (UniqueName: \"kubernetes.io/projected/e2fd6d31-1072-47b0-aa6b-327fac52a13b-kube-api-access-gm7sm\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504227 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504255 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-svc\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: 
\"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504273 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rsbs\" (UniqueName: \"kubernetes.io/projected/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-kube-api-access-8rsbs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504310 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-config\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504325 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxmdj\" (UniqueName: \"kubernetes.io/projected/aa96d2ec-af8f-4ef3-96a2-108e003c669b-kube-api-access-lxmdj\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504425 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504455 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504485 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2fd6d31-1072-47b0-aa6b-327fac52a13b-logs\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504502 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504535 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504559 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.506050 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2fd6d31-1072-47b0-aa6b-327fac52a13b-logs\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 
20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.509668 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.510272 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-config-data\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.524617 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm7sm\" (UniqueName: \"kubernetes.io/projected/e2fd6d31-1072-47b0-aa6b-327fac52a13b-kube-api-access-gm7sm\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.548371 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.606456 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.606554 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-svc\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.606584 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rsbs\" (UniqueName: \"kubernetes.io/projected/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-kube-api-access-8rsbs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.606954 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-config\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.607527 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc 
kubenswrapper[4790]: I0313 20:49:42.607526 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-svc\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.607808 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxmdj\" (UniqueName: \"kubernetes.io/projected/aa96d2ec-af8f-4ef3-96a2-108e003c669b-kube-api-access-lxmdj\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.608068 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.608103 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.608125 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.608153 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-config\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.608196 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.608965 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.609604 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.611971 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.621208 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.623910 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rsbs\" (UniqueName: \"kubernetes.io/projected/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-kube-api-access-8rsbs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.630543 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxmdj\" (UniqueName: \"kubernetes.io/projected/aa96d2ec-af8f-4ef3-96a2-108e003c669b-kube-api-access-lxmdj\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.684873 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.732834 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.756054 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.781407 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gj4j7"] Mar 13 20:49:42 crc kubenswrapper[4790]: W0313 20:49:42.823471 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode71d98c3_e247_448e_945e_016a6755c689.slice/crio-55f3196c901a679f999ea7048b99d1e69e5d8f8dcae2885a569b98a151420968 WatchSource:0}: Error finding container 55f3196c901a679f999ea7048b99d1e69e5d8f8dcae2885a569b98a151420968: Status 404 returned error can't find the container with id 55f3196c901a679f999ea7048b99d1e69e5d8f8dcae2885a569b98a151420968 Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.925720 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.945746 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gj4j7" event={"ID":"e71d98c3-e247-448e-945e-016a6755c689","Type":"ContainerStarted","Data":"55f3196c901a679f999ea7048b99d1e69e5d8f8dcae2885a569b98a151420968"} Mar 13 20:49:42 crc kubenswrapper[4790]: W0313 20:49:42.950967 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09a61a2b_7821_476f_af33_74837a0e2026.slice/crio-23a76757c4e9c694669e42f69462004741f841168e478fbc62cbd9c2dbd01401 WatchSource:0}: Error finding container 23a76757c4e9c694669e42f69462004741f841168e478fbc62cbd9c2dbd01401: Status 404 returned error can't find the container with id 23a76757c4e9c694669e42f69462004741f841168e478fbc62cbd9c2dbd01401 Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.189097 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.287572 4790 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bh2vb"] Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.289034 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.291772 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.292335 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.298890 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bh2vb"] Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.330088 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.433739 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.433784 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-config-data\") pod \"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.433854 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlmqd\" (UniqueName: 
\"kubernetes.io/projected/255451e0-9cb8-424f-a327-6e7ef4e4d775-kube-api-access-rlmqd\") pod \"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.434124 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-scripts\") pod \"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: W0313 20:49:43.475559 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7082b53_1345_4c47_a9bf_b87d9e1fd3ca.slice/crio-51acefd6782a10757a2c7ce4b0059b90de6cb5277f1ef646ad641cd716c69ecd WatchSource:0}: Error finding container 51acefd6782a10757a2c7ce4b0059b90de6cb5277f1ef646ad641cd716c69ecd: Status 404 returned error can't find the container with id 51acefd6782a10757a2c7ce4b0059b90de6cb5277f1ef646ad641cd716c69ecd Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.484158 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.536638 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.536688 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-config-data\") pod 
\"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.536773 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlmqd\" (UniqueName: \"kubernetes.io/projected/255451e0-9cb8-424f-a327-6e7ef4e4d775-kube-api-access-rlmqd\") pod \"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.536948 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-scripts\") pod \"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.542656 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.542986 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-scripts\") pod \"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.543459 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-config-data\") pod \"nova-cell1-conductor-db-sync-bh2vb\" 
(UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.554243 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlmqd\" (UniqueName: \"kubernetes.io/projected/255451e0-9cb8-424f-a327-6e7ef4e4d775-kube-api-access-rlmqd\") pod \"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.589922 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-jstn6"] Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.613318 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.959207 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e2fd6d31-1072-47b0-aa6b-327fac52a13b","Type":"ContainerStarted","Data":"04571522f47168f89424bb71a3f0416a30a9cbe088abe70c50ac3387dcafbc56"} Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.961187 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gj4j7" event={"ID":"e71d98c3-e247-448e-945e-016a6755c689","Type":"ContainerStarted","Data":"670aaab126129ee380c6ae05f38d955bab6fe47a4a8d19ac0dbaca35d3cd9ecc"} Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.962895 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"09a61a2b-7821-476f-af33-74837a0e2026","Type":"ContainerStarted","Data":"23a76757c4e9c694669e42f69462004741f841168e478fbc62cbd9c2dbd01401"} Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.965612 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca","Type":"ContainerStarted","Data":"51acefd6782a10757a2c7ce4b0059b90de6cb5277f1ef646ad641cd716c69ecd"} Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.967131 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"24025591-dced-41d1-bd6d-e8784c0caa3b","Type":"ContainerStarted","Data":"703a1bf7672ad738d5a0561a4b2308100e00dc344ee885923cb3275bce620370"} Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.971447 4790 generic.go:334] "Generic (PLEG): container finished" podID="aa96d2ec-af8f-4ef3-96a2-108e003c669b" containerID="62406a3417f49cd6fee467ec15aafed59672de36ebec3945dba28321522a57f0" exitCode=0 Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.971499 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" event={"ID":"aa96d2ec-af8f-4ef3-96a2-108e003c669b","Type":"ContainerDied","Data":"62406a3417f49cd6fee467ec15aafed59672de36ebec3945dba28321522a57f0"} Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.971529 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" event={"ID":"aa96d2ec-af8f-4ef3-96a2-108e003c669b","Type":"ContainerStarted","Data":"d410e2281728cc5d35324b5d9753eac6a283696daed20ea1b6c874c3b410e22c"} Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.979486 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-gj4j7" podStartSLOduration=2.97946824 podStartE2EDuration="2.97946824s" podCreationTimestamp="2026-03-13 20:49:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:49:43.976247812 +0000 UTC m=+1314.997363703" watchObservedRunningTime="2026-03-13 20:49:43.97946824 +0000 UTC m=+1315.000584131" Mar 13 20:49:44 crc kubenswrapper[4790]: I0313 20:49:44.118058 4790 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bh2vb"] Mar 13 20:49:44 crc kubenswrapper[4790]: W0313 20:49:44.968303 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod255451e0_9cb8_424f_a327_6e7ef4e4d775.slice/crio-e74515b96fc5b1b6d1708a6223eb1e4dd8c20dddeee25e960edd582e66f5fe0c WatchSource:0}: Error finding container e74515b96fc5b1b6d1708a6223eb1e4dd8c20dddeee25e960edd582e66f5fe0c: Status 404 returned error can't find the container with id e74515b96fc5b1b6d1708a6223eb1e4dd8c20dddeee25e960edd582e66f5fe0c Mar 13 20:49:44 crc kubenswrapper[4790]: I0313 20:49:44.985988 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bh2vb" event={"ID":"255451e0-9cb8-424f-a327-6e7ef4e4d775","Type":"ContainerStarted","Data":"e74515b96fc5b1b6d1708a6223eb1e4dd8c20dddeee25e960edd582e66f5fe0c"} Mar 13 20:49:44 crc kubenswrapper[4790]: I0313 20:49:44.990752 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" event={"ID":"aa96d2ec-af8f-4ef3-96a2-108e003c669b","Type":"ContainerStarted","Data":"31569c9f9b3d97fb94632b2003b39bbe5006cf77f3a60db89747921488537e4f"} Mar 13 20:49:44 crc kubenswrapper[4790]: I0313 20:49:44.990811 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:46 crc kubenswrapper[4790]: I0313 20:49:46.024982 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" podStartSLOduration=4.02495418 podStartE2EDuration="4.02495418s" podCreationTimestamp="2026-03-13 20:49:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:49:45.011178099 +0000 UTC m=+1316.032293990" watchObservedRunningTime="2026-03-13 20:49:46.02495418 +0000 UTC m=+1317.046070071" Mar 
13 20:49:46 crc kubenswrapper[4790]: I0313 20:49:46.026991 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 20:49:46 crc kubenswrapper[4790]: I0313 20:49:46.036638 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.015728 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"09a61a2b-7821-476f-af33-74837a0e2026","Type":"ContainerStarted","Data":"bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50"} Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.019802 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca","Type":"ContainerStarted","Data":"883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76"} Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.019848 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="f7082b53-1345-4c47-a9bf-b87d9e1fd3ca" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76" gracePeriod=30 Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.026464 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"24025591-dced-41d1-bd6d-e8784c0caa3b","Type":"ContainerStarted","Data":"1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d"} Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.026515 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"24025591-dced-41d1-bd6d-e8784c0caa3b","Type":"ContainerStarted","Data":"f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263"} Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.029881 4790 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-metadata-0" event={"ID":"e2fd6d31-1072-47b0-aa6b-327fac52a13b","Type":"ContainerStarted","Data":"31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1"} Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.029913 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e2fd6d31-1072-47b0-aa6b-327fac52a13b","Type":"ContainerStarted","Data":"d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe"} Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.030033 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e2fd6d31-1072-47b0-aa6b-327fac52a13b" containerName="nova-metadata-log" containerID="cri-o://d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe" gracePeriod=30 Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.030147 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e2fd6d31-1072-47b0-aa6b-327fac52a13b" containerName="nova-metadata-metadata" containerID="cri-o://31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1" gracePeriod=30 Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.044604 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.313984373 podStartE2EDuration="6.04458572s" podCreationTimestamp="2026-03-13 20:49:41 +0000 UTC" firstStartedPulling="2026-03-13 20:49:42.979174097 +0000 UTC m=+1314.000289988" lastFinishedPulling="2026-03-13 20:49:45.709775444 +0000 UTC m=+1316.730891335" observedRunningTime="2026-03-13 20:49:47.034555337 +0000 UTC m=+1318.055671248" watchObservedRunningTime="2026-03-13 20:49:47.04458572 +0000 UTC m=+1318.065701611" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.053942 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bh2vb" 
event={"ID":"255451e0-9cb8-424f-a327-6e7ef4e4d775","Type":"ContainerStarted","Data":"d6d96802df47b7b6e53732dfd053c7dabc95a96dcf532db8586c981fb4fcd115"} Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.062752 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.537505361 podStartE2EDuration="5.062729975s" podCreationTimestamp="2026-03-13 20:49:42 +0000 UTC" firstStartedPulling="2026-03-13 20:49:43.185941497 +0000 UTC m=+1314.207057388" lastFinishedPulling="2026-03-13 20:49:45.711166111 +0000 UTC m=+1316.732282002" observedRunningTime="2026-03-13 20:49:47.060130524 +0000 UTC m=+1318.081246415" watchObservedRunningTime="2026-03-13 20:49:47.062729975 +0000 UTC m=+1318.083845866" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.088639 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.856090979 podStartE2EDuration="5.088614291s" podCreationTimestamp="2026-03-13 20:49:42 +0000 UTC" firstStartedPulling="2026-03-13 20:49:43.47791968 +0000 UTC m=+1314.499035571" lastFinishedPulling="2026-03-13 20:49:45.710442982 +0000 UTC m=+1316.731558883" observedRunningTime="2026-03-13 20:49:47.0761542 +0000 UTC m=+1318.097270091" watchObservedRunningTime="2026-03-13 20:49:47.088614291 +0000 UTC m=+1318.109730182" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.105028 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.7307323500000003 podStartE2EDuration="5.105010268s" podCreationTimestamp="2026-03-13 20:49:42 +0000 UTC" firstStartedPulling="2026-03-13 20:49:43.335629569 +0000 UTC m=+1314.356745460" lastFinishedPulling="2026-03-13 20:49:45.709907497 +0000 UTC m=+1316.731023378" observedRunningTime="2026-03-13 20:49:47.092326722 +0000 UTC m=+1318.113442643" watchObservedRunningTime="2026-03-13 20:49:47.105010268 +0000 UTC m=+1318.126126159" 
Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.117818 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-bh2vb" podStartSLOduration=4.117798036 podStartE2EDuration="4.117798036s" podCreationTimestamp="2026-03-13 20:49:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:49:47.113221952 +0000 UTC m=+1318.134337843" watchObservedRunningTime="2026-03-13 20:49:47.117798036 +0000 UTC m=+1318.138913927" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.402451 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.641681 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.740359 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-combined-ca-bundle\") pod \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.740806 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm7sm\" (UniqueName: \"kubernetes.io/projected/e2fd6d31-1072-47b0-aa6b-327fac52a13b-kube-api-access-gm7sm\") pod \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.740850 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-config-data\") pod \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " Mar 13 
20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.740945 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2fd6d31-1072-47b0-aa6b-327fac52a13b-logs\") pod \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.741951 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2fd6d31-1072-47b0-aa6b-327fac52a13b-logs" (OuterVolumeSpecName: "logs") pod "e2fd6d31-1072-47b0-aa6b-327fac52a13b" (UID: "e2fd6d31-1072-47b0-aa6b-327fac52a13b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.758773 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2fd6d31-1072-47b0-aa6b-327fac52a13b-kube-api-access-gm7sm" (OuterVolumeSpecName: "kube-api-access-gm7sm") pod "e2fd6d31-1072-47b0-aa6b-327fac52a13b" (UID: "e2fd6d31-1072-47b0-aa6b-327fac52a13b"). InnerVolumeSpecName "kube-api-access-gm7sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.772573 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2fd6d31-1072-47b0-aa6b-327fac52a13b" (UID: "e2fd6d31-1072-47b0-aa6b-327fac52a13b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.779691 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-config-data" (OuterVolumeSpecName: "config-data") pod "e2fd6d31-1072-47b0-aa6b-327fac52a13b" (UID: "e2fd6d31-1072-47b0-aa6b-327fac52a13b"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.794669 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.843790 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm7sm\" (UniqueName: \"kubernetes.io/projected/e2fd6d31-1072-47b0-aa6b-327fac52a13b-kube-api-access-gm7sm\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.843864 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.843878 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2fd6d31-1072-47b0-aa6b-327fac52a13b-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.843887 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.069997 4790 generic.go:334] "Generic (PLEG): container finished" podID="e2fd6d31-1072-47b0-aa6b-327fac52a13b" containerID="31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1" exitCode=0 Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.070038 4790 generic.go:334] "Generic (PLEG): container finished" podID="e2fd6d31-1072-47b0-aa6b-327fac52a13b" containerID="d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe" exitCode=143 Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.070111 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.070188 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e2fd6d31-1072-47b0-aa6b-327fac52a13b","Type":"ContainerDied","Data":"31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1"} Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.070226 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e2fd6d31-1072-47b0-aa6b-327fac52a13b","Type":"ContainerDied","Data":"d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe"} Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.070261 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e2fd6d31-1072-47b0-aa6b-327fac52a13b","Type":"ContainerDied","Data":"04571522f47168f89424bb71a3f0416a30a9cbe088abe70c50ac3387dcafbc56"} Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.070280 4790 scope.go:117] "RemoveContainer" containerID="31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.118293 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.124007 4790 scope.go:117] "RemoveContainer" containerID="d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.127370 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.143185 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:48 crc kubenswrapper[4790]: E0313 20:49:48.143628 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2fd6d31-1072-47b0-aa6b-327fac52a13b" containerName="nova-metadata-metadata" Mar 13 20:49:48 crc 
kubenswrapper[4790]: I0313 20:49:48.143648 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2fd6d31-1072-47b0-aa6b-327fac52a13b" containerName="nova-metadata-metadata" Mar 13 20:49:48 crc kubenswrapper[4790]: E0313 20:49:48.143663 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2fd6d31-1072-47b0-aa6b-327fac52a13b" containerName="nova-metadata-log" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.143671 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2fd6d31-1072-47b0-aa6b-327fac52a13b" containerName="nova-metadata-log" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.143912 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2fd6d31-1072-47b0-aa6b-327fac52a13b" containerName="nova-metadata-metadata" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.143941 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2fd6d31-1072-47b0-aa6b-327fac52a13b" containerName="nova-metadata-log" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.145004 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.147286 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.147479 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.173050 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.222802 4790 scope.go:117] "RemoveContainer" containerID="31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1" Mar 13 20:49:48 crc kubenswrapper[4790]: E0313 20:49:48.223239 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1\": container with ID starting with 31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1 not found: ID does not exist" containerID="31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.223288 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1"} err="failed to get container status \"31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1\": rpc error: code = NotFound desc = could not find container \"31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1\": container with ID starting with 31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1 not found: ID does not exist" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.223308 4790 scope.go:117] "RemoveContainer" containerID="d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe" Mar 13 20:49:48 crc 
kubenswrapper[4790]: E0313 20:49:48.223787 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe\": container with ID starting with d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe not found: ID does not exist" containerID="d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.223815 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe"} err="failed to get container status \"d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe\": rpc error: code = NotFound desc = could not find container \"d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe\": container with ID starting with d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe not found: ID does not exist" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.223833 4790 scope.go:117] "RemoveContainer" containerID="31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.224102 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1"} err="failed to get container status \"31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1\": rpc error: code = NotFound desc = could not find container \"31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1\": container with ID starting with 31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1 not found: ID does not exist" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.224128 4790 scope.go:117] "RemoveContainer" containerID="d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe" Mar 13 
20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.224434 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe"} err="failed to get container status \"d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe\": rpc error: code = NotFound desc = could not find container \"d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe\": container with ID starting with d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe not found: ID does not exist" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.254120 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-logs\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.254182 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-config-data\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.254204 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn2wp\" (UniqueName: \"kubernetes.io/projected/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-kube-api-access-qn2wp\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.254284 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.254337 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.356149 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.356350 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-logs\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.356438 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-config-data\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.356519 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn2wp\" (UniqueName: \"kubernetes.io/projected/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-kube-api-access-qn2wp\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " 
pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.356824 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-logs\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.357060 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.363080 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.372946 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-config-data\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.372973 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.380151 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn2wp\" (UniqueName: 
\"kubernetes.io/projected/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-kube-api-access-qn2wp\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.524726 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:49:49 crc kubenswrapper[4790]: W0313 20:49:49.051367 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37e33a9e_1def_49b1_b1a7_81be1f5e72ee.slice/crio-a7693eddaf0a22601e6dc9f54784ec4f74f708b3aed816092645a24ca4db0419 WatchSource:0}: Error finding container a7693eddaf0a22601e6dc9f54784ec4f74f708b3aed816092645a24ca4db0419: Status 404 returned error can't find the container with id a7693eddaf0a22601e6dc9f54784ec4f74f708b3aed816092645a24ca4db0419 Mar 13 20:49:49 crc kubenswrapper[4790]: I0313 20:49:49.055968 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:49 crc kubenswrapper[4790]: I0313 20:49:49.081063 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37e33a9e-1def-49b1-b1a7-81be1f5e72ee","Type":"ContainerStarted","Data":"a7693eddaf0a22601e6dc9f54784ec4f74f708b3aed816092645a24ca4db0419"} Mar 13 20:49:49 crc kubenswrapper[4790]: I0313 20:49:49.684091 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2fd6d31-1072-47b0-aa6b-327fac52a13b" path="/var/lib/kubelet/pods/e2fd6d31-1072-47b0-aa6b-327fac52a13b/volumes" Mar 13 20:49:50 crc kubenswrapper[4790]: I0313 20:49:50.093148 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37e33a9e-1def-49b1-b1a7-81be1f5e72ee","Type":"ContainerStarted","Data":"0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01"} Mar 13 20:49:50 crc kubenswrapper[4790]: I0313 20:49:50.094477 4790 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37e33a9e-1def-49b1-b1a7-81be1f5e72ee","Type":"ContainerStarted","Data":"589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161"} Mar 13 20:49:50 crc kubenswrapper[4790]: I0313 20:49:50.127160 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.127125124 podStartE2EDuration="2.127125124s" podCreationTimestamp="2026-03-13 20:49:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:49:50.113087241 +0000 UTC m=+1321.134203152" watchObservedRunningTime="2026-03-13 20:49:50.127125124 +0000 UTC m=+1321.148241055" Mar 13 20:49:51 crc kubenswrapper[4790]: I0313 20:49:51.107029 4790 generic.go:334] "Generic (PLEG): container finished" podID="e71d98c3-e247-448e-945e-016a6755c689" containerID="670aaab126129ee380c6ae05f38d955bab6fe47a4a8d19ac0dbaca35d3cd9ecc" exitCode=0 Mar 13 20:49:51 crc kubenswrapper[4790]: I0313 20:49:51.108467 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gj4j7" event={"ID":"e71d98c3-e247-448e-945e-016a6755c689","Type":"ContainerDied","Data":"670aaab126129ee380c6ae05f38d955bab6fe47a4a8d19ac0dbaca35d3cd9ecc"} Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.402594 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.432082 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.478189 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.550236 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.551193 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.634430 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-combined-ca-bundle\") pod \"e71d98c3-e247-448e-945e-016a6755c689\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.634595 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc945\" (UniqueName: \"kubernetes.io/projected/e71d98c3-e247-448e-945e-016a6755c689-kube-api-access-mc945\") pod \"e71d98c3-e247-448e-945e-016a6755c689\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.634633 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-config-data\") pod \"e71d98c3-e247-448e-945e-016a6755c689\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.634948 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-scripts\") pod \"e71d98c3-e247-448e-945e-016a6755c689\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.640350 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-scripts" (OuterVolumeSpecName: "scripts") pod "e71d98c3-e247-448e-945e-016a6755c689" (UID: "e71d98c3-e247-448e-945e-016a6755c689"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.640699 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e71d98c3-e247-448e-945e-016a6755c689-kube-api-access-mc945" (OuterVolumeSpecName: "kube-api-access-mc945") pod "e71d98c3-e247-448e-945e-016a6755c689" (UID: "e71d98c3-e247-448e-945e-016a6755c689"). InnerVolumeSpecName "kube-api-access-mc945". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.666821 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-config-data" (OuterVolumeSpecName: "config-data") pod "e71d98c3-e247-448e-945e-016a6755c689" (UID: "e71d98c3-e247-448e-945e-016a6755c689"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.666995 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e71d98c3-e247-448e-945e-016a6755c689" (UID: "e71d98c3-e247-448e-945e-016a6755c689"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.734566 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.738621 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.738647 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc945\" (UniqueName: \"kubernetes.io/projected/e71d98c3-e247-448e-945e-016a6755c689-kube-api-access-mc945\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.738694 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.738702 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.787010 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-5hxds"] Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.787297 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" podUID="5348982d-ffd4-4226-8c69-1984dc02ffbe" containerName="dnsmasq-dns" containerID="cri-o://716981bb1f84db71c9f5dc98afe338733b9e6edcfcc60b6262bd07cda695e5bb" gracePeriod=10 Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.134788 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gj4j7" 
event={"ID":"e71d98c3-e247-448e-945e-016a6755c689","Type":"ContainerDied","Data":"55f3196c901a679f999ea7048b99d1e69e5d8f8dcae2885a569b98a151420968"} Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.134834 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55f3196c901a679f999ea7048b99d1e69e5d8f8dcae2885a569b98a151420968" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.134912 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.140939 4790 generic.go:334] "Generic (PLEG): container finished" podID="255451e0-9cb8-424f-a327-6e7ef4e4d775" containerID="d6d96802df47b7b6e53732dfd053c7dabc95a96dcf532db8586c981fb4fcd115" exitCode=0 Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.141699 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bh2vb" event={"ID":"255451e0-9cb8-424f-a327-6e7ef4e4d775","Type":"ContainerDied","Data":"d6d96802df47b7b6e53732dfd053c7dabc95a96dcf532db8586c981fb4fcd115"} Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.161562 4790 generic.go:334] "Generic (PLEG): container finished" podID="5348982d-ffd4-4226-8c69-1984dc02ffbe" containerID="716981bb1f84db71c9f5dc98afe338733b9e6edcfcc60b6262bd07cda695e5bb" exitCode=0 Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.161830 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" event={"ID":"5348982d-ffd4-4226-8c69-1984dc02ffbe","Type":"ContainerDied","Data":"716981bb1f84db71c9f5dc98afe338733b9e6edcfcc60b6262bd07cda695e5bb"} Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.194465 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.235298 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.319707 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.359921 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-config\") pod \"5348982d-ffd4-4226-8c69-1984dc02ffbe\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.360039 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-nb\") pod \"5348982d-ffd4-4226-8c69-1984dc02ffbe\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.360118 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-swift-storage-0\") pod \"5348982d-ffd4-4226-8c69-1984dc02ffbe\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.360182 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-sb\") pod \"5348982d-ffd4-4226-8c69-1984dc02ffbe\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.360264 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-svc\") pod \"5348982d-ffd4-4226-8c69-1984dc02ffbe\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " Mar 13 20:49:53 crc 
kubenswrapper[4790]: I0313 20:49:53.360311 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5spxd\" (UniqueName: \"kubernetes.io/projected/5348982d-ffd4-4226-8c69-1984dc02ffbe-kube-api-access-5spxd\") pod \"5348982d-ffd4-4226-8c69-1984dc02ffbe\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.361075 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.362130 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="37e33a9e-1def-49b1-b1a7-81be1f5e72ee" containerName="nova-metadata-log" containerID="cri-o://589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161" gracePeriod=30 Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.362722 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="37e33a9e-1def-49b1-b1a7-81be1f5e72ee" containerName="nova-metadata-metadata" containerID="cri-o://0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01" gracePeriod=30 Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.380831 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5348982d-ffd4-4226-8c69-1984dc02ffbe-kube-api-access-5spxd" (OuterVolumeSpecName: "kube-api-access-5spxd") pod "5348982d-ffd4-4226-8c69-1984dc02ffbe" (UID: "5348982d-ffd4-4226-8c69-1984dc02ffbe"). InnerVolumeSpecName "kube-api-access-5spxd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.452518 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-config" (OuterVolumeSpecName: "config") pod "5348982d-ffd4-4226-8c69-1984dc02ffbe" (UID: "5348982d-ffd4-4226-8c69-1984dc02ffbe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.459366 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5348982d-ffd4-4226-8c69-1984dc02ffbe" (UID: "5348982d-ffd4-4226-8c69-1984dc02ffbe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.461694 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5348982d-ffd4-4226-8c69-1984dc02ffbe" (UID: "5348982d-ffd4-4226-8c69-1984dc02ffbe"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.462309 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.462403 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5spxd\" (UniqueName: \"kubernetes.io/projected/5348982d-ffd4-4226-8c69-1984dc02ffbe-kube-api-access-5spxd\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.462478 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.462540 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.469862 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5348982d-ffd4-4226-8c69-1984dc02ffbe" (UID: "5348982d-ffd4-4226-8c69-1984dc02ffbe"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.473872 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5348982d-ffd4-4226-8c69-1984dc02ffbe" (UID: "5348982d-ffd4-4226-8c69-1984dc02ffbe"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.564045 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.564319 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.634574 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.634596 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.773955 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.955337 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.073652 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-combined-ca-bundle\") pod \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.073825 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-logs\") pod \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.073898 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-nova-metadata-tls-certs\") pod \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.073971 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-config-data\") pod \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.074122 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn2wp\" (UniqueName: \"kubernetes.io/projected/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-kube-api-access-qn2wp\") pod \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.074185 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-logs" (OuterVolumeSpecName: "logs") pod "37e33a9e-1def-49b1-b1a7-81be1f5e72ee" (UID: "37e33a9e-1def-49b1-b1a7-81be1f5e72ee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.075230 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.080111 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-kube-api-access-qn2wp" (OuterVolumeSpecName: "kube-api-access-qn2wp") pod "37e33a9e-1def-49b1-b1a7-81be1f5e72ee" (UID: "37e33a9e-1def-49b1-b1a7-81be1f5e72ee"). InnerVolumeSpecName "kube-api-access-qn2wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.101104 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37e33a9e-1def-49b1-b1a7-81be1f5e72ee" (UID: "37e33a9e-1def-49b1-b1a7-81be1f5e72ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.109558 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-config-data" (OuterVolumeSpecName: "config-data") pod "37e33a9e-1def-49b1-b1a7-81be1f5e72ee" (UID: "37e33a9e-1def-49b1-b1a7-81be1f5e72ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.126219 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "37e33a9e-1def-49b1-b1a7-81be1f5e72ee" (UID: "37e33a9e-1def-49b1-b1a7-81be1f5e72ee"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.176939 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qn2wp\" (UniqueName: \"kubernetes.io/projected/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-kube-api-access-qn2wp\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.177266 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.177353 4790 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.177449 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.184172 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.184101 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" event={"ID":"5348982d-ffd4-4226-8c69-1984dc02ffbe","Type":"ContainerDied","Data":"90cf908fda5bfa83deaae1fd0eac95ba601f9eb9da62b0fab2c3af0677ac98b2"} Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.184637 4790 scope.go:117] "RemoveContainer" containerID="716981bb1f84db71c9f5dc98afe338733b9e6edcfcc60b6262bd07cda695e5bb" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.188325 4790 generic.go:334] "Generic (PLEG): container finished" podID="37e33a9e-1def-49b1-b1a7-81be1f5e72ee" containerID="0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01" exitCode=0 Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.188353 4790 generic.go:334] "Generic (PLEG): container finished" podID="37e33a9e-1def-49b1-b1a7-81be1f5e72ee" containerID="589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161" exitCode=143 Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.188372 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37e33a9e-1def-49b1-b1a7-81be1f5e72ee","Type":"ContainerDied","Data":"0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01"} Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.188414 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.188446 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37e33a9e-1def-49b1-b1a7-81be1f5e72ee","Type":"ContainerDied","Data":"589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161"} Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.188464 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37e33a9e-1def-49b1-b1a7-81be1f5e72ee","Type":"ContainerDied","Data":"a7693eddaf0a22601e6dc9f54784ec4f74f708b3aed816092645a24ca4db0419"} Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.189114 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerName="nova-api-log" containerID="cri-o://f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263" gracePeriod=30 Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.189506 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerName="nova-api-api" containerID="cri-o://1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d" gracePeriod=30 Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.227450 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-5hxds"] Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.236110 4790 scope.go:117] "RemoveContainer" containerID="54996574df6debfb6f3430b43b232f15654c266d463f051ee19ed34e62244f6c" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.242784 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-5hxds"] Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.261714 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] 
Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.277078 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.290519 4790 scope.go:117] "RemoveContainer" containerID="0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.299429 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:54 crc kubenswrapper[4790]: E0313 20:49:54.299890 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5348982d-ffd4-4226-8c69-1984dc02ffbe" containerName="dnsmasq-dns" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.299909 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5348982d-ffd4-4226-8c69-1984dc02ffbe" containerName="dnsmasq-dns" Mar 13 20:49:54 crc kubenswrapper[4790]: E0313 20:49:54.299929 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e33a9e-1def-49b1-b1a7-81be1f5e72ee" containerName="nova-metadata-metadata" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.299936 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e33a9e-1def-49b1-b1a7-81be1f5e72ee" containerName="nova-metadata-metadata" Mar 13 20:49:54 crc kubenswrapper[4790]: E0313 20:49:54.299951 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e33a9e-1def-49b1-b1a7-81be1f5e72ee" containerName="nova-metadata-log" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.299957 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e33a9e-1def-49b1-b1a7-81be1f5e72ee" containerName="nova-metadata-log" Mar 13 20:49:54 crc kubenswrapper[4790]: E0313 20:49:54.299969 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5348982d-ffd4-4226-8c69-1984dc02ffbe" containerName="init" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.299977 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5348982d-ffd4-4226-8c69-1984dc02ffbe" containerName="init" Mar 13 20:49:54 crc kubenswrapper[4790]: E0313 20:49:54.299997 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e71d98c3-e247-448e-945e-016a6755c689" containerName="nova-manage" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.300003 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e71d98c3-e247-448e-945e-016a6755c689" containerName="nova-manage" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.300172 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e71d98c3-e247-448e-945e-016a6755c689" containerName="nova-manage" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.300184 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="37e33a9e-1def-49b1-b1a7-81be1f5e72ee" containerName="nova-metadata-metadata" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.300242 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="37e33a9e-1def-49b1-b1a7-81be1f5e72ee" containerName="nova-metadata-log" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.300258 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5348982d-ffd4-4226-8c69-1984dc02ffbe" containerName="dnsmasq-dns" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.301335 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.303927 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.305123 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.308638 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.340930 4790 scope.go:117] "RemoveContainer" containerID="589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.391654 4790 scope.go:117] "RemoveContainer" containerID="0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01" Mar 13 20:49:54 crc kubenswrapper[4790]: E0313 20:49:54.394524 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01\": container with ID starting with 0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01 not found: ID does not exist" containerID="0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.394572 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01"} err="failed to get container status \"0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01\": rpc error: code = NotFound desc = could not find container \"0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01\": container with ID starting with 0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01 not found: ID does not exist" Mar 13 20:49:54 crc 
kubenswrapper[4790]: I0313 20:49:54.394598 4790 scope.go:117] "RemoveContainer" containerID="589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161" Mar 13 20:49:54 crc kubenswrapper[4790]: E0313 20:49:54.398913 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161\": container with ID starting with 589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161 not found: ID does not exist" containerID="589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.398972 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161"} err="failed to get container status \"589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161\": rpc error: code = NotFound desc = could not find container \"589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161\": container with ID starting with 589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161 not found: ID does not exist" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.399006 4790 scope.go:117] "RemoveContainer" containerID="0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.399841 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01"} err="failed to get container status \"0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01\": rpc error: code = NotFound desc = could not find container \"0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01\": container with ID starting with 0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01 not found: ID does not exist" Mar 13 
20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.399863 4790 scope.go:117] "RemoveContainer" containerID="589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.400131 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161"} err="failed to get container status \"589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161\": rpc error: code = NotFound desc = could not find container \"589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161\": container with ID starting with 589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161 not found: ID does not exist" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.486036 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-config-data\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.486574 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.486661 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g45p4\" (UniqueName: \"kubernetes.io/projected/b6868acd-5476-49b4-958c-8f68fde161b9-kube-api-access-g45p4\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.486746 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.486811 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6868acd-5476-49b4-958c-8f68fde161b9-logs\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.732219 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.732298 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g45p4\" (UniqueName: \"kubernetes.io/projected/b6868acd-5476-49b4-958c-8f68fde161b9-kube-api-access-g45p4\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.732343 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.732419 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b6868acd-5476-49b4-958c-8f68fde161b9-logs\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.732481 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-config-data\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.734846 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6868acd-5476-49b4-958c-8f68fde161b9-logs\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.739348 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-config-data\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.740097 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.744658 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 
20:49:54.758811 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g45p4\" (UniqueName: \"kubernetes.io/projected/b6868acd-5476-49b4-958c-8f68fde161b9-kube-api-access-g45p4\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.846232 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.955496 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.060912 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-combined-ca-bundle\") pod \"255451e0-9cb8-424f-a327-6e7ef4e4d775\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.061292 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-scripts\") pod \"255451e0-9cb8-424f-a327-6e7ef4e4d775\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.061495 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-config-data\") pod \"255451e0-9cb8-424f-a327-6e7ef4e4d775\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.061552 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlmqd\" (UniqueName: 
\"kubernetes.io/projected/255451e0-9cb8-424f-a327-6e7ef4e4d775-kube-api-access-rlmqd\") pod \"255451e0-9cb8-424f-a327-6e7ef4e4d775\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.071590 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-scripts" (OuterVolumeSpecName: "scripts") pod "255451e0-9cb8-424f-a327-6e7ef4e4d775" (UID: "255451e0-9cb8-424f-a327-6e7ef4e4d775"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.071771 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/255451e0-9cb8-424f-a327-6e7ef4e4d775-kube-api-access-rlmqd" (OuterVolumeSpecName: "kube-api-access-rlmqd") pod "255451e0-9cb8-424f-a327-6e7ef4e4d775" (UID: "255451e0-9cb8-424f-a327-6e7ef4e4d775"). InnerVolumeSpecName "kube-api-access-rlmqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.100787 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "255451e0-9cb8-424f-a327-6e7ef4e4d775" (UID: "255451e0-9cb8-424f-a327-6e7ef4e4d775"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.112127 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-config-data" (OuterVolumeSpecName: "config-data") pod "255451e0-9cb8-424f-a327-6e7ef4e4d775" (UID: "255451e0-9cb8-424f-a327-6e7ef4e4d775"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.163896 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.163946 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlmqd\" (UniqueName: \"kubernetes.io/projected/255451e0-9cb8-424f-a327-6e7ef4e4d775-kube-api-access-rlmqd\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.163963 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.163975 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.266071 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 20:49:55 crc kubenswrapper[4790]: E0313 20:49:55.266465 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="255451e0-9cb8-424f-a327-6e7ef4e4d775" containerName="nova-cell1-conductor-db-sync" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.266479 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="255451e0-9cb8-424f-a327-6e7ef4e4d775" containerName="nova-cell1-conductor-db-sync" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.266727 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="255451e0-9cb8-424f-a327-6e7ef4e4d775" containerName="nova-cell1-conductor-db-sync" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.267356 4790 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.271961 4790 generic.go:334] "Generic (PLEG): container finished" podID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerID="f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263" exitCode=143 Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.272091 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"24025591-dced-41d1-bd6d-e8784c0caa3b","Type":"ContainerDied","Data":"f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263"} Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.282225 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="09a61a2b-7821-476f-af33-74837a0e2026" containerName="nova-scheduler-scheduler" containerID="cri-o://bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50" gracePeriod=30 Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.282627 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.284972 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bh2vb" event={"ID":"255451e0-9cb8-424f-a327-6e7ef4e4d775","Type":"ContainerDied","Data":"e74515b96fc5b1b6d1708a6223eb1e4dd8c20dddeee25e960edd582e66f5fe0c"} Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.285007 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e74515b96fc5b1b6d1708a6223eb1e4dd8c20dddeee25e960edd582e66f5fe0c" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.285023 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.368069 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b\") " pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.368431 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p95db\" (UniqueName: \"kubernetes.io/projected/0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b-kube-api-access-p95db\") pod \"nova-cell1-conductor-0\" (UID: \"0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b\") " pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.368516 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b\") " pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:55 crc 
kubenswrapper[4790]: I0313 20:49:55.456813 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.489491 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b\") " pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.490471 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p95db\" (UniqueName: \"kubernetes.io/projected/0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b-kube-api-access-p95db\") pod \"nova-cell1-conductor-0\" (UID: \"0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b\") " pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.490626 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b\") " pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.501346 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b\") " pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.502223 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b\") " 
pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.529284 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p95db\" (UniqueName: \"kubernetes.io/projected/0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b-kube-api-access-p95db\") pod \"nova-cell1-conductor-0\" (UID: \"0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b\") " pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.595800 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.688394 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37e33a9e-1def-49b1-b1a7-81be1f5e72ee" path="/var/lib/kubelet/pods/37e33a9e-1def-49b1-b1a7-81be1f5e72ee/volumes" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.689559 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5348982d-ffd4-4226-8c69-1984dc02ffbe" path="/var/lib/kubelet/pods/5348982d-ffd4-4226-8c69-1984dc02ffbe/volumes" Mar 13 20:49:56 crc kubenswrapper[4790]: I0313 20:49:56.014757 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 20:49:56 crc kubenswrapper[4790]: I0313 20:49:56.293401 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6868acd-5476-49b4-958c-8f68fde161b9","Type":"ContainerStarted","Data":"ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498"} Mar 13 20:49:56 crc kubenswrapper[4790]: I0313 20:49:56.293451 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6868acd-5476-49b4-958c-8f68fde161b9","Type":"ContainerStarted","Data":"8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67"} Mar 13 20:49:56 crc kubenswrapper[4790]: I0313 20:49:56.293463 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"b6868acd-5476-49b4-958c-8f68fde161b9","Type":"ContainerStarted","Data":"1998e9c0ed2f49c51df1fb979275385f3c3c928b8ffbba368fee9881d45e3a34"} Mar 13 20:49:56 crc kubenswrapper[4790]: I0313 20:49:56.295159 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b","Type":"ContainerStarted","Data":"090e00a03545aa608baa22ecbda44610983cec4d0dc2ac4b17f3618770499479"} Mar 13 20:49:56 crc kubenswrapper[4790]: I0313 20:49:56.295184 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b","Type":"ContainerStarted","Data":"36392374d77fd75658a5cb748dd738c66a6b5e5859fa85fa2b6ce243e537f157"} Mar 13 20:49:56 crc kubenswrapper[4790]: I0313 20:49:56.295331 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:56 crc kubenswrapper[4790]: I0313 20:49:56.314276 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.314255326 podStartE2EDuration="2.314255326s" podCreationTimestamp="2026-03-13 20:49:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:49:56.308241482 +0000 UTC m=+1327.329357373" watchObservedRunningTime="2026-03-13 20:49:56.314255326 +0000 UTC m=+1327.335371217" Mar 13 20:49:56 crc kubenswrapper[4790]: I0313 20:49:56.337253 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.337231612 podStartE2EDuration="1.337231612s" podCreationTimestamp="2026-03-13 20:49:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:49:56.328229667 +0000 UTC 
m=+1327.349345558" watchObservedRunningTime="2026-03-13 20:49:56.337231612 +0000 UTC m=+1327.358347503" Mar 13 20:49:57 crc kubenswrapper[4790]: E0313 20:49:57.404635 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 20:49:57 crc kubenswrapper[4790]: E0313 20:49:57.407706 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 20:49:57 crc kubenswrapper[4790]: E0313 20:49:57.409893 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 20:49:57 crc kubenswrapper[4790]: E0313 20:49:57.409938 4790 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="09a61a2b-7821-476f-af33-74837a0e2026" containerName="nova-scheduler-scheduler" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.084814 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.154199 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-config-data\") pod \"09a61a2b-7821-476f-af33-74837a0e2026\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.154295 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85bct\" (UniqueName: \"kubernetes.io/projected/09a61a2b-7821-476f-af33-74837a0e2026-kube-api-access-85bct\") pod \"09a61a2b-7821-476f-af33-74837a0e2026\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.154370 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-combined-ca-bundle\") pod \"09a61a2b-7821-476f-af33-74837a0e2026\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.179481 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09a61a2b-7821-476f-af33-74837a0e2026-kube-api-access-85bct" (OuterVolumeSpecName: "kube-api-access-85bct") pod "09a61a2b-7821-476f-af33-74837a0e2026" (UID: "09a61a2b-7821-476f-af33-74837a0e2026"). InnerVolumeSpecName "kube-api-access-85bct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.186538 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-config-data" (OuterVolumeSpecName: "config-data") pod "09a61a2b-7821-476f-af33-74837a0e2026" (UID: "09a61a2b-7821-476f-af33-74837a0e2026"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.196680 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09a61a2b-7821-476f-af33-74837a0e2026" (UID: "09a61a2b-7821-476f-af33-74837a0e2026"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.257890 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.258297 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85bct\" (UniqueName: \"kubernetes.io/projected/09a61a2b-7821-476f-af33-74837a0e2026-kube-api-access-85bct\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.258363 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.323000 4790 generic.go:334] "Generic (PLEG): container finished" podID="09a61a2b-7821-476f-af33-74837a0e2026" containerID="bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50" exitCode=0 Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.323082 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.323106 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"09a61a2b-7821-476f-af33-74837a0e2026","Type":"ContainerDied","Data":"bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50"} Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.323452 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"09a61a2b-7821-476f-af33-74837a0e2026","Type":"ContainerDied","Data":"23a76757c4e9c694669e42f69462004741f841168e478fbc62cbd9c2dbd01401"} Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.323479 4790 scope.go:117] "RemoveContainer" containerID="bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.349009 4790 scope.go:117] "RemoveContainer" containerID="bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50" Mar 13 20:49:58 crc kubenswrapper[4790]: E0313 20:49:58.349633 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50\": container with ID starting with bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50 not found: ID does not exist" containerID="bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.349674 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50"} err="failed to get container status \"bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50\": rpc error: code = NotFound desc = could not find container \"bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50\": container with ID starting with 
bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50 not found: ID does not exist" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.362747 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.373836 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.385572 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:49:58 crc kubenswrapper[4790]: E0313 20:49:58.386305 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09a61a2b-7821-476f-af33-74837a0e2026" containerName="nova-scheduler-scheduler" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.386584 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a61a2b-7821-476f-af33-74837a0e2026" containerName="nova-scheduler-scheduler" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.389042 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="09a61a2b-7821-476f-af33-74837a0e2026" containerName="nova-scheduler-scheduler" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.390164 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.395262 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.397715 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.462719 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7pzd\" (UniqueName: \"kubernetes.io/projected/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-kube-api-access-x7pzd\") pod \"nova-scheduler-0\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.463272 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-config-data\") pod \"nova-scheduler-0\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.463345 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.565410 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7pzd\" (UniqueName: \"kubernetes.io/projected/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-kube-api-access-x7pzd\") pod \"nova-scheduler-0\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.565624 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-config-data\") pod \"nova-scheduler-0\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.565661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.569551 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-config-data\") pod \"nova-scheduler-0\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.569602 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.581703 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7pzd\" (UniqueName: \"kubernetes.io/projected/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-kube-api-access-x7pzd\") pod \"nova-scheduler-0\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.715702 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.200550 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.245165 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.275998 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6fnm\" (UniqueName: \"kubernetes.io/projected/24025591-dced-41d1-bd6d-e8784c0caa3b-kube-api-access-h6fnm\") pod \"24025591-dced-41d1-bd6d-e8784c0caa3b\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.276088 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-config-data\") pod \"24025591-dced-41d1-bd6d-e8784c0caa3b\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.276126 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-combined-ca-bundle\") pod \"24025591-dced-41d1-bd6d-e8784c0caa3b\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.276291 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24025591-dced-41d1-bd6d-e8784c0caa3b-logs\") pod \"24025591-dced-41d1-bd6d-e8784c0caa3b\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.277312 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24025591-dced-41d1-bd6d-e8784c0caa3b-logs" 
(OuterVolumeSpecName: "logs") pod "24025591-dced-41d1-bd6d-e8784c0caa3b" (UID: "24025591-dced-41d1-bd6d-e8784c0caa3b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.280581 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24025591-dced-41d1-bd6d-e8784c0caa3b-kube-api-access-h6fnm" (OuterVolumeSpecName: "kube-api-access-h6fnm") pod "24025591-dced-41d1-bd6d-e8784c0caa3b" (UID: "24025591-dced-41d1-bd6d-e8784c0caa3b"). InnerVolumeSpecName "kube-api-access-h6fnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.309899 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-config-data" (OuterVolumeSpecName: "config-data") pod "24025591-dced-41d1-bd6d-e8784c0caa3b" (UID: "24025591-dced-41d1-bd6d-e8784c0caa3b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.312505 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24025591-dced-41d1-bd6d-e8784c0caa3b" (UID: "24025591-dced-41d1-bd6d-e8784c0caa3b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.337165 4790 generic.go:334] "Generic (PLEG): container finished" podID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerID="1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d" exitCode=0 Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.337235 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"24025591-dced-41d1-bd6d-e8784c0caa3b","Type":"ContainerDied","Data":"1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d"} Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.337278 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.337315 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"24025591-dced-41d1-bd6d-e8784c0caa3b","Type":"ContainerDied","Data":"703a1bf7672ad738d5a0561a4b2308100e00dc344ee885923cb3275bce620370"} Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.337336 4790 scope.go:117] "RemoveContainer" containerID="1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.339813 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4","Type":"ContainerStarted","Data":"6e7ed25e629647fdcbcdadd57d668b2cdfbe95f3b450a5732f265251e324ddb9"} Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.372212 4790 scope.go:117] "RemoveContainer" containerID="f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.380372 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24025591-dced-41d1-bd6d-e8784c0caa3b-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:59 crc 
kubenswrapper[4790]: I0313 20:49:59.380421 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6fnm\" (UniqueName: \"kubernetes.io/projected/24025591-dced-41d1-bd6d-e8784c0caa3b-kube-api-access-h6fnm\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.380445 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.380457 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.385156 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.397642 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.412192 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 20:49:59 crc kubenswrapper[4790]: E0313 20:49:59.412618 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerName="nova-api-api" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.412635 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerName="nova-api-api" Mar 13 20:49:59 crc kubenswrapper[4790]: E0313 20:49:59.412648 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerName="nova-api-log" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.412655 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="24025591-dced-41d1-bd6d-e8784c0caa3b" 
containerName="nova-api-log" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.412835 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerName="nova-api-api" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.412852 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerName="nova-api-log" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.413800 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.418726 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.420396 4790 scope.go:117] "RemoveContainer" containerID="1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d" Mar 13 20:49:59 crc kubenswrapper[4790]: E0313 20:49:59.423435 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d\": container with ID starting with 1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d not found: ID does not exist" containerID="1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.423488 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d"} err="failed to get container status \"1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d\": rpc error: code = NotFound desc = could not find container \"1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d\": container with ID starting with 1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d not found: ID does not exist" 
Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.423522 4790 scope.go:117] "RemoveContainer" containerID="f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263" Mar 13 20:49:59 crc kubenswrapper[4790]: E0313 20:49:59.425765 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263\": container with ID starting with f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263 not found: ID does not exist" containerID="f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.425792 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263"} err="failed to get container status \"f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263\": rpc error: code = NotFound desc = could not find container \"f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263\": container with ID starting with f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263 not found: ID does not exist" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.437519 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.481917 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-config-data\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.481994 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.482040 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-logs\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.482098 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mgtb\" (UniqueName: \"kubernetes.io/projected/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-kube-api-access-7mgtb\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.583114 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-config-data\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.583192 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.583233 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-logs\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 
20:49:59.583269 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mgtb\" (UniqueName: \"kubernetes.io/projected/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-kube-api-access-7mgtb\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.583947 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-logs\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.588039 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-config-data\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.588977 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.598552 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mgtb\" (UniqueName: \"kubernetes.io/projected/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-kube-api-access-7mgtb\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.670155 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09a61a2b-7821-476f-af33-74837a0e2026" path="/var/lib/kubelet/pods/09a61a2b-7821-476f-af33-74837a0e2026/volumes" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 
20:49:59.670843 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24025591-dced-41d1-bd6d-e8784c0caa3b" path="/var/lib/kubelet/pods/24025591-dced-41d1-bd6d-e8784c0caa3b/volumes" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.749125 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.132851 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557250-wqt56"] Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.134674 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557250-wqt56" Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.137830 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.137945 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.138187 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.146114 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557250-wqt56"] Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.183593 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.195312 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26zzl\" (UniqueName: \"kubernetes.io/projected/d00a5fd8-e634-4969-90ad-6850179e7de1-kube-api-access-26zzl\") pod \"auto-csr-approver-29557250-wqt56\" (UID: \"d00a5fd8-e634-4969-90ad-6850179e7de1\") " 
pod="openshift-infra/auto-csr-approver-29557250-wqt56" Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.303073 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26zzl\" (UniqueName: \"kubernetes.io/projected/d00a5fd8-e634-4969-90ad-6850179e7de1-kube-api-access-26zzl\") pod \"auto-csr-approver-29557250-wqt56\" (UID: \"d00a5fd8-e634-4969-90ad-6850179e7de1\") " pod="openshift-infra/auto-csr-approver-29557250-wqt56" Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.323981 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26zzl\" (UniqueName: \"kubernetes.io/projected/d00a5fd8-e634-4969-90ad-6850179e7de1-kube-api-access-26zzl\") pod \"auto-csr-approver-29557250-wqt56\" (UID: \"d00a5fd8-e634-4969-90ad-6850179e7de1\") " pod="openshift-infra/auto-csr-approver-29557250-wqt56" Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.352293 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4","Type":"ContainerStarted","Data":"94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e"} Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.354539 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8","Type":"ContainerStarted","Data":"d40e3fdf8db9cbcc4affa484642d64cee75cff31f6fe4e94fb4c91f6efd99014"} Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.381945 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.381917209 podStartE2EDuration="2.381917209s" podCreationTimestamp="2026-03-13 20:49:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:50:00.377013735 +0000 UTC m=+1331.398129636" watchObservedRunningTime="2026-03-13 
20:50:00.381917209 +0000 UTC m=+1331.403033110" Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.453036 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557250-wqt56" Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.904333 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557250-wqt56"] Mar 13 20:50:01 crc kubenswrapper[4790]: I0313 20:50:01.368405 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557250-wqt56" event={"ID":"d00a5fd8-e634-4969-90ad-6850179e7de1","Type":"ContainerStarted","Data":"cb2b864f403d942403e811cc253d4d1b44763bf8f61a8cd36937bc69bd77a8eb"} Mar 13 20:50:01 crc kubenswrapper[4790]: I0313 20:50:01.371215 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8","Type":"ContainerStarted","Data":"3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525"} Mar 13 20:50:01 crc kubenswrapper[4790]: I0313 20:50:01.371288 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8","Type":"ContainerStarted","Data":"f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f"} Mar 13 20:50:01 crc kubenswrapper[4790]: I0313 20:50:01.398963 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.398944448 podStartE2EDuration="2.398944448s" podCreationTimestamp="2026-03-13 20:49:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:50:01.393132389 +0000 UTC m=+1332.414248290" watchObservedRunningTime="2026-03-13 20:50:01.398944448 +0000 UTC m=+1332.420060339" Mar 13 20:50:02 crc kubenswrapper[4790]: I0313 20:50:02.466745 4790 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 13 20:50:03 crc kubenswrapper[4790]: I0313 20:50:03.395710 4790 generic.go:334] "Generic (PLEG): container finished" podID="d00a5fd8-e634-4969-90ad-6850179e7de1" containerID="a46a82afe76ba100b2ac912d7fb0a03ce75de0a957f3543d9259571fea13e90c" exitCode=0 Mar 13 20:50:03 crc kubenswrapper[4790]: I0313 20:50:03.395786 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557250-wqt56" event={"ID":"d00a5fd8-e634-4969-90ad-6850179e7de1","Type":"ContainerDied","Data":"a46a82afe76ba100b2ac912d7fb0a03ce75de0a957f3543d9259571fea13e90c"} Mar 13 20:50:03 crc kubenswrapper[4790]: I0313 20:50:03.716044 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 20:50:04 crc kubenswrapper[4790]: I0313 20:50:04.810518 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557250-wqt56" Mar 13 20:50:04 crc kubenswrapper[4790]: I0313 20:50:04.911196 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26zzl\" (UniqueName: \"kubernetes.io/projected/d00a5fd8-e634-4969-90ad-6850179e7de1-kube-api-access-26zzl\") pod \"d00a5fd8-e634-4969-90ad-6850179e7de1\" (UID: \"d00a5fd8-e634-4969-90ad-6850179e7de1\") " Mar 13 20:50:04 crc kubenswrapper[4790]: I0313 20:50:04.918641 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d00a5fd8-e634-4969-90ad-6850179e7de1-kube-api-access-26zzl" (OuterVolumeSpecName: "kube-api-access-26zzl") pod "d00a5fd8-e634-4969-90ad-6850179e7de1" (UID: "d00a5fd8-e634-4969-90ad-6850179e7de1"). InnerVolumeSpecName "kube-api-access-26zzl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:50:04 crc kubenswrapper[4790]: I0313 20:50:04.955715 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 20:50:04 crc kubenswrapper[4790]: I0313 20:50:04.955764 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 20:50:05 crc kubenswrapper[4790]: I0313 20:50:05.014083 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26zzl\" (UniqueName: \"kubernetes.io/projected/d00a5fd8-e634-4969-90ad-6850179e7de1-kube-api-access-26zzl\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:05 crc kubenswrapper[4790]: I0313 20:50:05.414534 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557250-wqt56" event={"ID":"d00a5fd8-e634-4969-90ad-6850179e7de1","Type":"ContainerDied","Data":"cb2b864f403d942403e811cc253d4d1b44763bf8f61a8cd36937bc69bd77a8eb"} Mar 13 20:50:05 crc kubenswrapper[4790]: I0313 20:50:05.414578 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb2b864f403d942403e811cc253d4d1b44763bf8f61a8cd36937bc69bd77a8eb" Mar 13 20:50:05 crc kubenswrapper[4790]: I0313 20:50:05.414593 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557250-wqt56" Mar 13 20:50:05 crc kubenswrapper[4790]: I0313 20:50:05.626590 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 13 20:50:05 crc kubenswrapper[4790]: I0313 20:50:05.885616 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557244-sndr9"] Mar 13 20:50:05 crc kubenswrapper[4790]: I0313 20:50:05.894672 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557244-sndr9"] Mar 13 20:50:05 crc kubenswrapper[4790]: I0313 20:50:05.956694 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 20:50:05 crc kubenswrapper[4790]: I0313 20:50:05.956915 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="b4696d4e-6124-4bcc-b257-651108f6b837" containerName="kube-state-metrics" containerID="cri-o://7c5a942da36087bdc3e181e8806caccf07be11d3c05fd4b5b28443007ca270c8" gracePeriod=30 Mar 13 20:50:05 crc kubenswrapper[4790]: I0313 20:50:05.971629 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b6868acd-5476-49b4-958c-8f68fde161b9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 20:50:05 crc kubenswrapper[4790]: I0313 20:50:05.971656 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b6868acd-5476-49b4-958c-8f68fde161b9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 20:50:06 crc kubenswrapper[4790]: I0313 20:50:06.425704 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="b4696d4e-6124-4bcc-b257-651108f6b837" containerID="7c5a942da36087bdc3e181e8806caccf07be11d3c05fd4b5b28443007ca270c8" exitCode=2 Mar 13 20:50:06 crc kubenswrapper[4790]: I0313 20:50:06.425799 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b4696d4e-6124-4bcc-b257-651108f6b837","Type":"ContainerDied","Data":"7c5a942da36087bdc3e181e8806caccf07be11d3c05fd4b5b28443007ca270c8"} Mar 13 20:50:06 crc kubenswrapper[4790]: I0313 20:50:06.426069 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b4696d4e-6124-4bcc-b257-651108f6b837","Type":"ContainerDied","Data":"a3ba4dde9b3affbf2de80fd01b6004ec5bcc39b41c69eac7056b983bf5ce8c10"} Mar 13 20:50:06 crc kubenswrapper[4790]: I0313 20:50:06.426087 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3ba4dde9b3affbf2de80fd01b6004ec5bcc39b41c69eac7056b983bf5ce8c10" Mar 13 20:50:06 crc kubenswrapper[4790]: I0313 20:50:06.448939 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 20:50:06 crc kubenswrapper[4790]: I0313 20:50:06.541833 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cc5r\" (UniqueName: \"kubernetes.io/projected/b4696d4e-6124-4bcc-b257-651108f6b837-kube-api-access-6cc5r\") pod \"b4696d4e-6124-4bcc-b257-651108f6b837\" (UID: \"b4696d4e-6124-4bcc-b257-651108f6b837\") " Mar 13 20:50:06 crc kubenswrapper[4790]: I0313 20:50:06.550321 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4696d4e-6124-4bcc-b257-651108f6b837-kube-api-access-6cc5r" (OuterVolumeSpecName: "kube-api-access-6cc5r") pod "b4696d4e-6124-4bcc-b257-651108f6b837" (UID: "b4696d4e-6124-4bcc-b257-651108f6b837"). InnerVolumeSpecName "kube-api-access-6cc5r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:50:06 crc kubenswrapper[4790]: I0313 20:50:06.645415 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cc5r\" (UniqueName: \"kubernetes.io/projected/b4696d4e-6124-4bcc-b257-651108f6b837-kube-api-access-6cc5r\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.435207 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.467972 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.479547 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.489619 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 20:50:07 crc kubenswrapper[4790]: E0313 20:50:07.489998 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d00a5fd8-e634-4969-90ad-6850179e7de1" containerName="oc" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.490016 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d00a5fd8-e634-4969-90ad-6850179e7de1" containerName="oc" Mar 13 20:50:07 crc kubenswrapper[4790]: E0313 20:50:07.490029 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4696d4e-6124-4bcc-b257-651108f6b837" containerName="kube-state-metrics" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.490035 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4696d4e-6124-4bcc-b257-651108f6b837" containerName="kube-state-metrics" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.490224 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d00a5fd8-e634-4969-90ad-6850179e7de1" containerName="oc" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 
20:50:07.490243 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4696d4e-6124-4bcc-b257-651108f6b837" containerName="kube-state-metrics" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.490803 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.493917 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.494279 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.510819 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.560669 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.560744 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.560798 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsd4l\" (UniqueName: \"kubernetes.io/projected/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-kube-api-access-jsd4l\") pod \"kube-state-metrics-0\" (UID: 
\"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.561016 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.662320 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.662664 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.662772 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.662878 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsd4l\" (UniqueName: \"kubernetes.io/projected/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-kube-api-access-jsd4l\") pod \"kube-state-metrics-0\" (UID: 
\"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.666860 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.668412 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.671329 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f42b93e-6de8-423c-a2d5-dd57885de32c" path="/var/lib/kubelet/pods/7f42b93e-6de8-423c-a2d5-dd57885de32c/volumes" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.672343 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4696d4e-6124-4bcc-b257-651108f6b837" path="/var/lib/kubelet/pods/b4696d4e-6124-4bcc-b257-651108f6b837/volumes" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.674324 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.680599 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsd4l\" (UniqueName: \"kubernetes.io/projected/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-kube-api-access-jsd4l\") pod 
\"kube-state-metrics-0\" (UID: \"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.805932 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.864562 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.866326 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="ceilometer-central-agent" containerID="cri-o://f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603" gracePeriod=30 Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.866838 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="proxy-httpd" containerID="cri-o://986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46" gracePeriod=30 Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.866905 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="sg-core" containerID="cri-o://8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922" gracePeriod=30 Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.866949 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="ceilometer-notification-agent" containerID="cri-o://dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6" gracePeriod=30 Mar 13 20:50:08 crc kubenswrapper[4790]: I0313 20:50:08.437805 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 
20:50:08 crc kubenswrapper[4790]: I0313 20:50:08.452026 4790 generic.go:334] "Generic (PLEG): container finished" podID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerID="986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46" exitCode=0 Mar 13 20:50:08 crc kubenswrapper[4790]: I0313 20:50:08.452061 4790 generic.go:334] "Generic (PLEG): container finished" podID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerID="8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922" exitCode=2 Mar 13 20:50:08 crc kubenswrapper[4790]: I0313 20:50:08.452079 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31f1d628-34fa-4e75-8aa8-f3e724839ee8","Type":"ContainerDied","Data":"986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46"} Mar 13 20:50:08 crc kubenswrapper[4790]: I0313 20:50:08.452103 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31f1d628-34fa-4e75-8aa8-f3e724839ee8","Type":"ContainerDied","Data":"8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922"} Mar 13 20:50:08 crc kubenswrapper[4790]: I0313 20:50:08.716752 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 20:50:08 crc kubenswrapper[4790]: I0313 20:50:08.742626 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 20:50:09 crc kubenswrapper[4790]: I0313 20:50:09.462229 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2ae1ef11-086d-4d65-bfcb-987f3973fdc5","Type":"ContainerStarted","Data":"83cf2b18dc8eded2ca99b94c1507041268bc933f2620be8caa37cd148556a6c4"} Mar 13 20:50:09 crc kubenswrapper[4790]: I0313 20:50:09.462557 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 13 20:50:09 crc kubenswrapper[4790]: I0313 20:50:09.462571 4790 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2ae1ef11-086d-4d65-bfcb-987f3973fdc5","Type":"ContainerStarted","Data":"ba0a53291f6b5fed52f31c4c8977792a4c7df319c61a49ca605419e58bc73c3a"} Mar 13 20:50:09 crc kubenswrapper[4790]: I0313 20:50:09.465473 4790 generic.go:334] "Generic (PLEG): container finished" podID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerID="f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603" exitCode=0 Mar 13 20:50:09 crc kubenswrapper[4790]: I0313 20:50:09.465542 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31f1d628-34fa-4e75-8aa8-f3e724839ee8","Type":"ContainerDied","Data":"f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603"} Mar 13 20:50:09 crc kubenswrapper[4790]: I0313 20:50:09.489863 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.128949849 podStartE2EDuration="2.489842682s" podCreationTimestamp="2026-03-13 20:50:07 +0000 UTC" firstStartedPulling="2026-03-13 20:50:08.440927274 +0000 UTC m=+1339.462043165" lastFinishedPulling="2026-03-13 20:50:08.801820107 +0000 UTC m=+1339.822935998" observedRunningTime="2026-03-13 20:50:09.484182898 +0000 UTC m=+1340.505298789" watchObservedRunningTime="2026-03-13 20:50:09.489842682 +0000 UTC m=+1340.510958573" Mar 13 20:50:09 crc kubenswrapper[4790]: I0313 20:50:09.505730 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 20:50:09 crc kubenswrapper[4790]: I0313 20:50:09.750617 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 20:50:09 crc kubenswrapper[4790]: I0313 20:50:09.750672 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 20:50:10 crc kubenswrapper[4790]: I0313 20:50:10.832543 4790 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-api-0" podUID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 20:50:10 crc kubenswrapper[4790]: I0313 20:50:10.832689 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.475237 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.483476 4790 generic.go:334] "Generic (PLEG): container finished" podID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerID="dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6" exitCode=0 Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.483538 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31f1d628-34fa-4e75-8aa8-f3e724839ee8","Type":"ContainerDied","Data":"dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6"} Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.483577 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31f1d628-34fa-4e75-8aa8-f3e724839ee8","Type":"ContainerDied","Data":"d9418e81a4860ada9d69bb0521f12bde0b12309263c1efb3c1ac8e85db41aebe"} Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.483574 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.483667 4790 scope.go:117] "RemoveContainer" containerID="986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.507025 4790 scope.go:117] "RemoveContainer" containerID="8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.528066 4790 scope.go:117] "RemoveContainer" containerID="dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.552521 4790 scope.go:117] "RemoveContainer" containerID="f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.562899 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-config-data\") pod \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.562967 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-sg-core-conf-yaml\") pod \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.562992 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44qth\" (UniqueName: \"kubernetes.io/projected/31f1d628-34fa-4e75-8aa8-f3e724839ee8-kube-api-access-44qth\") pod \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.563076 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-combined-ca-bundle\") pod \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.563156 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-scripts\") pod \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.563203 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-run-httpd\") pod \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.563272 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-log-httpd\") pod \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.564172 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "31f1d628-34fa-4e75-8aa8-f3e724839ee8" (UID: "31f1d628-34fa-4e75-8aa8-f3e724839ee8"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.567697 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "31f1d628-34fa-4e75-8aa8-f3e724839ee8" (UID: "31f1d628-34fa-4e75-8aa8-f3e724839ee8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.573784 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31f1d628-34fa-4e75-8aa8-f3e724839ee8-kube-api-access-44qth" (OuterVolumeSpecName: "kube-api-access-44qth") pod "31f1d628-34fa-4e75-8aa8-f3e724839ee8" (UID: "31f1d628-34fa-4e75-8aa8-f3e724839ee8"). InnerVolumeSpecName "kube-api-access-44qth". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.582545 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-scripts" (OuterVolumeSpecName: "scripts") pod "31f1d628-34fa-4e75-8aa8-f3e724839ee8" (UID: "31f1d628-34fa-4e75-8aa8-f3e724839ee8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.613974 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "31f1d628-34fa-4e75-8aa8-f3e724839ee8" (UID: "31f1d628-34fa-4e75-8aa8-f3e724839ee8"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.639596 4790 scope.go:117] "RemoveContainer" containerID="986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46" Mar 13 20:50:11 crc kubenswrapper[4790]: E0313 20:50:11.660802 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46\": container with ID starting with 986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46 not found: ID does not exist" containerID="986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.660854 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46"} err="failed to get container status \"986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46\": rpc error: code = NotFound desc = could not find container \"986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46\": container with ID starting with 986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46 not found: ID does not exist" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.660879 4790 scope.go:117] "RemoveContainer" containerID="8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.667804 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.667841 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 
20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.667849 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.667859 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44qth\" (UniqueName: \"kubernetes.io/projected/31f1d628-34fa-4e75-8aa8-f3e724839ee8-kube-api-access-44qth\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.667868 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:11 crc kubenswrapper[4790]: E0313 20:50:11.668869 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922\": container with ID starting with 8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922 not found: ID does not exist" containerID="8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.668910 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922"} err="failed to get container status \"8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922\": rpc error: code = NotFound desc = could not find container \"8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922\": container with ID starting with 8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922 not found: ID does not exist" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.668935 4790 scope.go:117] "RemoveContainer" 
containerID="dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6" Mar 13 20:50:11 crc kubenswrapper[4790]: E0313 20:50:11.669204 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6\": container with ID starting with dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6 not found: ID does not exist" containerID="dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.669227 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6"} err="failed to get container status \"dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6\": rpc error: code = NotFound desc = could not find container \"dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6\": container with ID starting with dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6 not found: ID does not exist" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.669241 4790 scope.go:117] "RemoveContainer" containerID="f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603" Mar 13 20:50:11 crc kubenswrapper[4790]: E0313 20:50:11.669443 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603\": container with ID starting with f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603 not found: ID does not exist" containerID="f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.669461 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603"} err="failed to get container status \"f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603\": rpc error: code = NotFound desc = could not find container \"f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603\": container with ID starting with f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603 not found: ID does not exist" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.727571 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31f1d628-34fa-4e75-8aa8-f3e724839ee8" (UID: "31f1d628-34fa-4e75-8aa8-f3e724839ee8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.765758 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-config-data" (OuterVolumeSpecName: "config-data") pod "31f1d628-34fa-4e75-8aa8-f3e724839ee8" (UID: "31f1d628-34fa-4e75-8aa8-f3e724839ee8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.770233 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.770274 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.824307 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.844630 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.853586 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:11 crc kubenswrapper[4790]: E0313 20:50:11.854145 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="ceilometer-notification-agent" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.854171 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="ceilometer-notification-agent" Mar 13 20:50:11 crc kubenswrapper[4790]: E0313 20:50:11.854192 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="sg-core" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.854200 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="sg-core" Mar 13 20:50:11 crc kubenswrapper[4790]: E0313 20:50:11.854221 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="proxy-httpd" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.854229 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="proxy-httpd" Mar 13 20:50:11 crc kubenswrapper[4790]: E0313 20:50:11.854258 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="ceilometer-central-agent" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.854266 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="ceilometer-central-agent" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.854521 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="proxy-httpd" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.854547 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="ceilometer-central-agent" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.854568 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="ceilometer-notification-agent" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.854582 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="sg-core" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.856599 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.860979 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.861301 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.869024 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.880755 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.974832 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-config-data\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.974992 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.975113 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-run-httpd\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.975208 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fq8fz\" (UniqueName: \"kubernetes.io/projected/83ebcf30-733f-4074-9565-5582a160a8c3-kube-api-access-fq8fz\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.975253 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-log-httpd\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.975293 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.975354 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-scripts\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.975467 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.077142 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-combined-ca-bundle\") pod \"ceilometer-0\" 
(UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.077435 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-config-data\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.077612 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.077705 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-run-httpd\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.077791 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq8fz\" (UniqueName: \"kubernetes.io/projected/83ebcf30-733f-4074-9565-5582a160a8c3-kube-api-access-fq8fz\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.077870 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-log-httpd\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.077959 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.078067 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-scripts\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.078150 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-run-httpd\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.078355 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-log-httpd\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.082355 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.082359 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-scripts\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.082673 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.082778 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-config-data\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.083087 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.101295 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq8fz\" (UniqueName: \"kubernetes.io/projected/83ebcf30-733f-4074-9565-5582a160a8c3-kube-api-access-fq8fz\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.174905 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.675771 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.956450 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.956625 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 20:50:13 crc kubenswrapper[4790]: I0313 20:50:13.515492 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83ebcf30-733f-4074-9565-5582a160a8c3","Type":"ContainerStarted","Data":"e503f9a9feeac1738b128be87172b4c3f1ed19b15121b1befddaa40e1a7ba6f7"} Mar 13 20:50:13 crc kubenswrapper[4790]: I0313 20:50:13.516074 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83ebcf30-733f-4074-9565-5582a160a8c3","Type":"ContainerStarted","Data":"437f52cb36b41910903372ec5bcd7008b8c1ad39f31664517f1ae136ab440e48"} Mar 13 20:50:13 crc kubenswrapper[4790]: I0313 20:50:13.670247 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" path="/var/lib/kubelet/pods/31f1d628-34fa-4e75-8aa8-f3e724839ee8/volumes" Mar 13 20:50:14 crc kubenswrapper[4790]: I0313 20:50:14.526905 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83ebcf30-733f-4074-9565-5582a160a8c3","Type":"ContainerStarted","Data":"5a70244bb2bfdbe5ee07eedcf58b20d49e38de788b1d508f2fb5f8344abf2f5e"} Mar 13 20:50:14 crc kubenswrapper[4790]: I0313 20:50:14.961519 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 20:50:14 crc kubenswrapper[4790]: I0313 20:50:14.965843 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-metadata-0" Mar 13 20:50:14 crc kubenswrapper[4790]: I0313 20:50:14.966178 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 20:50:15 crc kubenswrapper[4790]: I0313 20:50:15.537069 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83ebcf30-733f-4074-9565-5582a160a8c3","Type":"ContainerStarted","Data":"48ff744eb8ffe6277235dc1660b7e57a170e8dd57ea9c3e069bc4793c884f688"} Mar 13 20:50:15 crc kubenswrapper[4790]: I0313 20:50:15.551343 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 20:50:17 crc kubenswrapper[4790]: W0313 20:50:17.125251 4790 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd00a5fd8_e634_4969_90ad_6850179e7de1.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd00a5fd8_e634_4969_90ad_6850179e7de1.slice: no such file or directory Mar 13 20:50:17 crc kubenswrapper[4790]: E0313 20:50:17.342861 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37e33a9e_1def_49b1_b1a7_81be1f5e72ee.slice/crio-a7693eddaf0a22601e6dc9f54784ec4f74f708b3aed816092645a24ca4db0419\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09a61a2b_7821_476f_af33_74837a0e2026.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode71d98c3_e247_448e_945e_016a6755c689.slice/crio-55f3196c901a679f999ea7048b99d1e69e5d8f8dcae2885a569b98a151420968\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1d628_34fa_4e75_8aa8_f3e724839ee8.slice/crio-986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1d628_34fa_4e75_8aa8_f3e724839ee8.slice/crio-conmon-f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09a61a2b_7821_476f_af33_74837a0e2026.slice/crio-conmon-bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1d628_34fa_4e75_8aa8_f3e724839ee8.slice/crio-conmon-8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24025591_dced_41d1_bd6d_e8784c0caa3b.slice/crio-conmon-1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24025591_dced_41d1_bd6d_e8784c0caa3b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24025591_dced_41d1_bd6d_e8784c0caa3b.slice/crio-1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4696d4e_6124_4bcc_b257_651108f6b837.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24025591_dced_41d1_bd6d_e8784c0caa3b.slice/crio-703a1bf7672ad738d5a0561a4b2308100e00dc344ee885923cb3275bce620370\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1d628_34fa_4e75_8aa8_f3e724839ee8.slice/crio-d9418e81a4860ada9d69bb0521f12bde0b12309263c1efb3c1ac8e85db41aebe\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09a61a2b_7821_476f_af33_74837a0e2026.slice/crio-bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1d628_34fa_4e75_8aa8_f3e724839ee8.slice/crio-conmon-986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod255451e0_9cb8_424f_a327_6e7ef4e4d775.slice/crio-e74515b96fc5b1b6d1708a6223eb1e4dd8c20dddeee25e960edd582e66f5fe0c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1d628_34fa_4e75_8aa8_f3e724839ee8.slice/crio-f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4696d4e_6124_4bcc_b257_651108f6b837.slice/crio-7c5a942da36087bdc3e181e8806caccf07be11d3c05fd4b5b28443007ca270c8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7082b53_1345_4c47_a9bf_b87d9e1fd3ca.slice/crio-883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod255451e0_9cb8_424f_a327_6e7ef4e4d775.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1d628_34fa_4e75_8aa8_f3e724839ee8.slice/crio-8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1d628_34fa_4e75_8aa8_f3e724839ee8.slice/crio-conmon-dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09a61a2b_7821_476f_af33_74837a0e2026.slice/crio-23a76757c4e9c694669e42f69462004741f841168e478fbc62cbd9c2dbd01401\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1d628_34fa_4e75_8aa8_f3e724839ee8.slice/crio-dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4696d4e_6124_4bcc_b257_651108f6b837.slice/crio-conmon-7c5a942da36087bdc3e181e8806caccf07be11d3c05fd4b5b28443007ca270c8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1d628_34fa_4e75_8aa8_f3e724839ee8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7082b53_1345_4c47_a9bf_b87d9e1fd3ca.slice/crio-conmon-883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76.scope\": RecentStats: unable to find data in memory cache]" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.439967 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.483016 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rsbs\" (UniqueName: \"kubernetes.io/projected/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-kube-api-access-8rsbs\") pod \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.483198 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-config-data\") pod \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.483299 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-combined-ca-bundle\") pod \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.488897 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-kube-api-access-8rsbs" (OuterVolumeSpecName: "kube-api-access-8rsbs") pod "f7082b53-1345-4c47-a9bf-b87d9e1fd3ca" (UID: "f7082b53-1345-4c47-a9bf-b87d9e1fd3ca"). InnerVolumeSpecName "kube-api-access-8rsbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.513192 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-config-data" (OuterVolumeSpecName: "config-data") pod "f7082b53-1345-4c47-a9bf-b87d9e1fd3ca" (UID: "f7082b53-1345-4c47-a9bf-b87d9e1fd3ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.517089 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7082b53-1345-4c47-a9bf-b87d9e1fd3ca" (UID: "f7082b53-1345-4c47-a9bf-b87d9e1fd3ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.555859 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83ebcf30-733f-4074-9565-5582a160a8c3","Type":"ContainerStarted","Data":"99a17e425d61cedf85a823b7432d78c2c5018dc3dd588da442972a911bda1678"} Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.557190 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.559174 4790 generic.go:334] "Generic (PLEG): container finished" podID="f7082b53-1345-4c47-a9bf-b87d9e1fd3ca" containerID="883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76" exitCode=137 Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.559642 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.561046 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca","Type":"ContainerDied","Data":"883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76"} Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.561116 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca","Type":"ContainerDied","Data":"51acefd6782a10757a2c7ce4b0059b90de6cb5277f1ef646ad641cd716c69ecd"} Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.561151 4790 scope.go:117] "RemoveContainer" containerID="883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.589173 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.589214 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.589225 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rsbs\" (UniqueName: \"kubernetes.io/projected/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-kube-api-access-8rsbs\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.592221 4790 scope.go:117] "RemoveContainer" containerID="883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76" Mar 13 20:50:17 crc kubenswrapper[4790]: E0313 20:50:17.592596 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76\": container with ID starting with 883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76 not found: ID does not exist" containerID="883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.592641 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76"} err="failed to get container status \"883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76\": rpc error: code = NotFound desc = could not find container \"883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76\": container with ID starting with 883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76 not found: ID does not exist" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.604150 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.337494835 podStartE2EDuration="6.604128496s" podCreationTimestamp="2026-03-13 20:50:11 +0000 UTC" firstStartedPulling="2026-03-13 20:50:12.67442427 +0000 UTC m=+1343.695540161" lastFinishedPulling="2026-03-13 20:50:16.941057931 +0000 UTC m=+1347.962173822" observedRunningTime="2026-03-13 20:50:17.593480475 +0000 UTC m=+1348.614596376" watchObservedRunningTime="2026-03-13 20:50:17.604128496 +0000 UTC m=+1348.625244397" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.616337 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.623899 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.643772 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 
20:50:17 crc kubenswrapper[4790]: E0313 20:50:17.644200 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7082b53-1345-4c47-a9bf-b87d9e1fd3ca" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.644221 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7082b53-1345-4c47-a9bf-b87d9e1fd3ca" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.644441 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7082b53-1345-4c47-a9bf-b87d9e1fd3ca" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.645002 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.647421 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.647770 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.648665 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.658543 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.699207 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7082b53-1345-4c47-a9bf-b87d9e1fd3ca" path="/var/lib/kubelet/pods/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca/volumes" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.749336 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.749772 4790 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.796775 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.796974 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.797027 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.797061 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.797115 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8plzs\" (UniqueName: \"kubernetes.io/projected/20c0842a-c69a-4af0-aef0-ffec3f3560bc-kube-api-access-8plzs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.817520 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.898807 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.898868 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8plzs\" (UniqueName: \"kubernetes.io/projected/20c0842a-c69a-4af0-aef0-ffec3f3560bc-kube-api-access-8plzs\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.898949 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.899035 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.899059 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.903813 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.904605 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.908148 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.915505 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.919230 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8plzs\" (UniqueName: \"kubernetes.io/projected/20c0842a-c69a-4af0-aef0-ffec3f3560bc-kube-api-access-8plzs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.976371 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:18 crc kubenswrapper[4790]: I0313 20:50:18.418569 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 20:50:18 crc kubenswrapper[4790]: I0313 20:50:18.569057 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"20c0842a-c69a-4af0-aef0-ffec3f3560bc","Type":"ContainerStarted","Data":"f4861d9a1f81670a88d5e3a86fe1788bb4d44171315713afe4aacb565a42102e"} Mar 13 20:50:19 crc kubenswrapper[4790]: I0313 20:50:19.580281 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"20c0842a-c69a-4af0-aef0-ffec3f3560bc","Type":"ContainerStarted","Data":"649f54ab6c2e84e74d811136c5f4c779bf01a09a300962f5507f0c484fdeb533"} Mar 13 20:50:19 crc kubenswrapper[4790]: I0313 20:50:19.614339 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.614320813 podStartE2EDuration="2.614320813s" podCreationTimestamp="2026-03-13 20:50:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:50:19.604020762 +0000 UTC m=+1350.625136653" watchObservedRunningTime="2026-03-13 20:50:19.614320813 +0000 UTC m=+1350.635436704" Mar 13 20:50:19 crc kubenswrapper[4790]: I0313 20:50:19.754181 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 20:50:19 crc kubenswrapper[4790]: I0313 20:50:19.757508 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 20:50:19 crc 
kubenswrapper[4790]: I0313 20:50:19.761766 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 20:50:20 crc kubenswrapper[4790]: I0313 20:50:20.593422 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 20:50:20 crc kubenswrapper[4790]: I0313 20:50:20.824947 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-kqzmj"] Mar 13 20:50:20 crc kubenswrapper[4790]: I0313 20:50:20.826782 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:20 crc kubenswrapper[4790]: I0313 20:50:20.855882 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-kqzmj"] Mar 13 20:50:20 crc kubenswrapper[4790]: I0313 20:50:20.968630 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:20 crc kubenswrapper[4790]: I0313 20:50:20.968752 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:20 crc kubenswrapper[4790]: I0313 20:50:20.968793 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz7vw\" (UniqueName: \"kubernetes.io/projected/03ea3d76-1bca-44e8-986c-8e751336f93d-kube-api-access-rz7vw\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " 
pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:20 crc kubenswrapper[4790]: I0313 20:50:20.968837 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-config\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:20 crc kubenswrapper[4790]: I0313 20:50:20.968867 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:20 crc kubenswrapper[4790]: I0313 20:50:20.968904 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.070306 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.070370 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz7vw\" (UniqueName: \"kubernetes.io/projected/03ea3d76-1bca-44e8-986c-8e751336f93d-kube-api-access-rz7vw\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " 
pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.070429 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-config\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.070456 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.070479 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.070528 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.071525 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-config\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.071561 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.071545 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.071886 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.071926 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.089239 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz7vw\" (UniqueName: \"kubernetes.io/projected/03ea3d76-1bca-44e8-986c-8e751336f93d-kube-api-access-rz7vw\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.153155 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.649313 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-kqzmj"] Mar 13 20:50:21 crc kubenswrapper[4790]: W0313 20:50:21.650946 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03ea3d76_1bca_44e8_986c_8e751336f93d.slice/crio-b70c45deb3c4d259bee0d048396a3a151ac96b6eb37804970ed797de5f96a100 WatchSource:0}: Error finding container b70c45deb3c4d259bee0d048396a3a151ac96b6eb37804970ed797de5f96a100: Status 404 returned error can't find the container with id b70c45deb3c4d259bee0d048396a3a151ac96b6eb37804970ed797de5f96a100 Mar 13 20:50:22 crc kubenswrapper[4790]: I0313 20:50:22.607008 4790 generic.go:334] "Generic (PLEG): container finished" podID="03ea3d76-1bca-44e8-986c-8e751336f93d" containerID="3bf4c1a3a8959712b6bdc6bb2a33893090891a1211e7646c25c1b2fcadfa4181" exitCode=0 Mar 13 20:50:22 crc kubenswrapper[4790]: I0313 20:50:22.607248 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" event={"ID":"03ea3d76-1bca-44e8-986c-8e751336f93d","Type":"ContainerDied","Data":"3bf4c1a3a8959712b6bdc6bb2a33893090891a1211e7646c25c1b2fcadfa4181"} Mar 13 20:50:22 crc kubenswrapper[4790]: I0313 20:50:22.607702 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" event={"ID":"03ea3d76-1bca-44e8-986c-8e751336f93d","Type":"ContainerStarted","Data":"b70c45deb3c4d259bee0d048396a3a151ac96b6eb37804970ed797de5f96a100"} Mar 13 20:50:22 crc kubenswrapper[4790]: I0313 20:50:22.977529 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:23 crc kubenswrapper[4790]: I0313 20:50:23.162811 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:23 crc 
kubenswrapper[4790]: I0313 20:50:23.619577 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" event={"ID":"03ea3d76-1bca-44e8-986c-8e751336f93d","Type":"ContainerStarted","Data":"4bd79e27621e3d3d4bce68941d1a486f8bc96266be819067b6ade98b7e023e29"} Mar 13 20:50:23 crc kubenswrapper[4790]: I0313 20:50:23.619664 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerName="nova-api-log" containerID="cri-o://f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f" gracePeriod=30 Mar 13 20:50:23 crc kubenswrapper[4790]: I0313 20:50:23.619961 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerName="nova-api-api" containerID="cri-o://3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525" gracePeriod=30 Mar 13 20:50:23 crc kubenswrapper[4790]: I0313 20:50:23.620234 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:23 crc kubenswrapper[4790]: I0313 20:50:23.654858 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" podStartSLOduration=3.654838816 podStartE2EDuration="3.654838816s" podCreationTimestamp="2026-03-13 20:50:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:50:23.645517002 +0000 UTC m=+1354.666632893" watchObservedRunningTime="2026-03-13 20:50:23.654838816 +0000 UTC m=+1354.675954707" Mar 13 20:50:23 crc kubenswrapper[4790]: I0313 20:50:23.731901 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:23 crc kubenswrapper[4790]: I0313 20:50:23.732227 4790 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/ceilometer-0" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="ceilometer-central-agent" containerID="cri-o://e503f9a9feeac1738b128be87172b4c3f1ed19b15121b1befddaa40e1a7ba6f7" gracePeriod=30 Mar 13 20:50:23 crc kubenswrapper[4790]: I0313 20:50:23.734257 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="proxy-httpd" containerID="cri-o://99a17e425d61cedf85a823b7432d78c2c5018dc3dd588da442972a911bda1678" gracePeriod=30 Mar 13 20:50:23 crc kubenswrapper[4790]: I0313 20:50:23.734568 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="sg-core" containerID="cri-o://48ff744eb8ffe6277235dc1660b7e57a170e8dd57ea9c3e069bc4793c884f688" gracePeriod=30 Mar 13 20:50:23 crc kubenswrapper[4790]: I0313 20:50:23.735227 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="ceilometer-notification-agent" containerID="cri-o://5a70244bb2bfdbe5ee07eedcf58b20d49e38de788b1d508f2fb5f8344abf2f5e" gracePeriod=30 Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.633412 4790 generic.go:334] "Generic (PLEG): container finished" podID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerID="f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f" exitCode=143 Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.633762 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8","Type":"ContainerDied","Data":"f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f"} Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.643097 4790 generic.go:334] "Generic (PLEG): container finished" podID="83ebcf30-733f-4074-9565-5582a160a8c3" 
containerID="99a17e425d61cedf85a823b7432d78c2c5018dc3dd588da442972a911bda1678" exitCode=0 Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.643130 4790 generic.go:334] "Generic (PLEG): container finished" podID="83ebcf30-733f-4074-9565-5582a160a8c3" containerID="48ff744eb8ffe6277235dc1660b7e57a170e8dd57ea9c3e069bc4793c884f688" exitCode=2 Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.643140 4790 generic.go:334] "Generic (PLEG): container finished" podID="83ebcf30-733f-4074-9565-5582a160a8c3" containerID="5a70244bb2bfdbe5ee07eedcf58b20d49e38de788b1d508f2fb5f8344abf2f5e" exitCode=0 Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.643148 4790 generic.go:334] "Generic (PLEG): container finished" podID="83ebcf30-733f-4074-9565-5582a160a8c3" containerID="e503f9a9feeac1738b128be87172b4c3f1ed19b15121b1befddaa40e1a7ba6f7" exitCode=0 Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.643868 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83ebcf30-733f-4074-9565-5582a160a8c3","Type":"ContainerDied","Data":"99a17e425d61cedf85a823b7432d78c2c5018dc3dd588da442972a911bda1678"} Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.643909 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83ebcf30-733f-4074-9565-5582a160a8c3","Type":"ContainerDied","Data":"48ff744eb8ffe6277235dc1660b7e57a170e8dd57ea9c3e069bc4793c884f688"} Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.643928 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83ebcf30-733f-4074-9565-5582a160a8c3","Type":"ContainerDied","Data":"5a70244bb2bfdbe5ee07eedcf58b20d49e38de788b1d508f2fb5f8344abf2f5e"} Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.643939 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"83ebcf30-733f-4074-9565-5582a160a8c3","Type":"ContainerDied","Data":"e503f9a9feeac1738b128be87172b4c3f1ed19b15121b1befddaa40e1a7ba6f7"} Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.924835 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.981503 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-sg-core-conf-yaml\") pod \"83ebcf30-733f-4074-9565-5582a160a8c3\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.981577 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-combined-ca-bundle\") pod \"83ebcf30-733f-4074-9565-5582a160a8c3\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.981905 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-log-httpd\") pod \"83ebcf30-733f-4074-9565-5582a160a8c3\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.981935 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fq8fz\" (UniqueName: \"kubernetes.io/projected/83ebcf30-733f-4074-9565-5582a160a8c3-kube-api-access-fq8fz\") pod \"83ebcf30-733f-4074-9565-5582a160a8c3\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.981985 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-scripts\") pod 
\"83ebcf30-733f-4074-9565-5582a160a8c3\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.982026 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-run-httpd\") pod \"83ebcf30-733f-4074-9565-5582a160a8c3\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.982057 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-config-data\") pod \"83ebcf30-733f-4074-9565-5582a160a8c3\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.982084 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-ceilometer-tls-certs\") pod \"83ebcf30-733f-4074-9565-5582a160a8c3\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.983477 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "83ebcf30-733f-4074-9565-5582a160a8c3" (UID: "83ebcf30-733f-4074-9565-5582a160a8c3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.983546 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "83ebcf30-733f-4074-9565-5582a160a8c3" (UID: "83ebcf30-733f-4074-9565-5582a160a8c3"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.988327 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83ebcf30-733f-4074-9565-5582a160a8c3-kube-api-access-fq8fz" (OuterVolumeSpecName: "kube-api-access-fq8fz") pod "83ebcf30-733f-4074-9565-5582a160a8c3" (UID: "83ebcf30-733f-4074-9565-5582a160a8c3"). InnerVolumeSpecName "kube-api-access-fq8fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.999811 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-scripts" (OuterVolumeSpecName: "scripts") pod "83ebcf30-733f-4074-9565-5582a160a8c3" (UID: "83ebcf30-733f-4074-9565-5582a160a8c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.021031 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "83ebcf30-733f-4074-9565-5582a160a8c3" (UID: "83ebcf30-733f-4074-9565-5582a160a8c3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.063510 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "83ebcf30-733f-4074-9565-5582a160a8c3" (UID: "83ebcf30-733f-4074-9565-5582a160a8c3"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.084295 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.084344 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.084357 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fq8fz\" (UniqueName: \"kubernetes.io/projected/83ebcf30-733f-4074-9565-5582a160a8c3-kube-api-access-fq8fz\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.084393 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.084403 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.084412 4790 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.097979 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83ebcf30-733f-4074-9565-5582a160a8c3" (UID: 
"83ebcf30-733f-4074-9565-5582a160a8c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.119481 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-config-data" (OuterVolumeSpecName: "config-data") pod "83ebcf30-733f-4074-9565-5582a160a8c3" (UID: "83ebcf30-733f-4074-9565-5582a160a8c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.186059 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.186101 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.656955 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83ebcf30-733f-4074-9565-5582a160a8c3","Type":"ContainerDied","Data":"437f52cb36b41910903372ec5bcd7008b8c1ad39f31664517f1ae136ab440e48"} Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.657033 4790 scope.go:117] "RemoveContainer" containerID="99a17e425d61cedf85a823b7432d78c2c5018dc3dd588da442972a911bda1678" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.657048 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.698250 4790 scope.go:117] "RemoveContainer" containerID="48ff744eb8ffe6277235dc1660b7e57a170e8dd57ea9c3e069bc4793c884f688" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.698418 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.708433 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.744559 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:25 crc kubenswrapper[4790]: E0313 20:50:25.745081 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="ceilometer-central-agent" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.745104 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="ceilometer-central-agent" Mar 13 20:50:25 crc kubenswrapper[4790]: E0313 20:50:25.745124 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="sg-core" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.745134 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="sg-core" Mar 13 20:50:25 crc kubenswrapper[4790]: E0313 20:50:25.745146 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="ceilometer-notification-agent" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.745153 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="ceilometer-notification-agent" Mar 13 20:50:25 crc kubenswrapper[4790]: E0313 20:50:25.745173 4790 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="proxy-httpd" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.745180 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="proxy-httpd" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.745428 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="proxy-httpd" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.745455 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="ceilometer-notification-agent" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.745470 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="sg-core" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.745485 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="ceilometer-central-agent" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.747541 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.749497 4790 scope.go:117] "RemoveContainer" containerID="5a70244bb2bfdbe5ee07eedcf58b20d49e38de788b1d508f2fb5f8344abf2f5e" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.752246 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.755242 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.755405 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.757006 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.795761 4790 scope.go:117] "RemoveContainer" containerID="e503f9a9feeac1738b128be87172b4c3f1ed19b15121b1befddaa40e1a7ba6f7" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.811063 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq5f9\" (UniqueName: \"kubernetes.io/projected/dd8215d8-8b4d-4c20-a832-e2088825019b-kube-api-access-xq5f9\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.811112 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-config-data\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.811147 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-scripts\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.811171 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.811322 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-run-httpd\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.811406 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-log-httpd\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.811447 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.811469 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.913646 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-scripts\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.913712 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.913872 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-run-httpd\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.914527 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-run-httpd\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.914847 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-log-httpd\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.914630 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-log-httpd\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.914934 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.914962 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.915328 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq5f9\" (UniqueName: \"kubernetes.io/projected/dd8215d8-8b4d-4c20-a832-e2088825019b-kube-api-access-xq5f9\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.915355 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-config-data\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.919726 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc 
kubenswrapper[4790]: I0313 20:50:25.920076 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.920128 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-scripts\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.921753 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-config-data\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.925724 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.934437 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq5f9\" (UniqueName: \"kubernetes.io/projected/dd8215d8-8b4d-4c20-a832-e2088825019b-kube-api-access-xq5f9\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:26 crc kubenswrapper[4790]: I0313 20:50:26.053855 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:26 crc kubenswrapper[4790]: I0313 20:50:26.054577 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:50:26 crc kubenswrapper[4790]: I0313 20:50:26.492074 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:26 crc kubenswrapper[4790]: W0313 20:50:26.493361 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd8215d8_8b4d_4c20_a832_e2088825019b.slice/crio-18dee079958c9239905a09047fe9e0fae646c0f6d6b8cea56e2986dac7e9414c WatchSource:0}: Error finding container 18dee079958c9239905a09047fe9e0fae646c0f6d6b8cea56e2986dac7e9414c: Status 404 returned error can't find the container with id 18dee079958c9239905a09047fe9e0fae646c0f6d6b8cea56e2986dac7e9414c Mar 13 20:50:26 crc kubenswrapper[4790]: I0313 20:50:26.667462 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8215d8-8b4d-4c20-a832-e2088825019b","Type":"ContainerStarted","Data":"18dee079958c9239905a09047fe9e0fae646c0f6d6b8cea56e2986dac7e9414c"} Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.381065 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.588598 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-logs\") pod \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.588722 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-combined-ca-bundle\") pod \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.588794 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-config-data\") pod \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.588935 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mgtb\" (UniqueName: \"kubernetes.io/projected/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-kube-api-access-7mgtb\") pod \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.591388 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-logs" (OuterVolumeSpecName: "logs") pod "e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" (UID: "e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.597094 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-kube-api-access-7mgtb" (OuterVolumeSpecName: "kube-api-access-7mgtb") pod "e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" (UID: "e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8"). InnerVolumeSpecName "kube-api-access-7mgtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.618562 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" (UID: "e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:27 crc kubenswrapper[4790]: E0313 20:50:27.621010 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37e33a9e_1def_49b1_b1a7_81be1f5e72ee.slice/crio-a7693eddaf0a22601e6dc9f54784ec4f74f708b3aed816092645a24ca4db0419\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode71d98c3_e247_448e_945e_016a6755c689.slice/crio-55f3196c901a679f999ea7048b99d1e69e5d8f8dcae2885a569b98a151420968\": RecentStats: unable to find data in memory cache]" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.627210 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-config-data" (OuterVolumeSpecName: "config-data") pod "e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" (UID: "e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.677400 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" path="/var/lib/kubelet/pods/83ebcf30-733f-4074-9565-5582a160a8c3/volumes" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.680361 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8215d8-8b4d-4c20-a832-e2088825019b","Type":"ContainerStarted","Data":"32a172eb03df58396e7932c7253bd1c867efa547f28c6e6ae81472b2dd89ad69"} Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.684121 4790 generic.go:334] "Generic (PLEG): container finished" podID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerID="3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525" exitCode=0 Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.684161 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8","Type":"ContainerDied","Data":"3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525"} Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.684186 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8","Type":"ContainerDied","Data":"d40e3fdf8db9cbcc4affa484642d64cee75cff31f6fe4e94fb4c91f6efd99014"} Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.684201 4790 scope.go:117] "RemoveContainer" containerID="3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.684336 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.692201 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.692233 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.692248 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.692261 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mgtb\" (UniqueName: \"kubernetes.io/projected/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-kube-api-access-7mgtb\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.714601 4790 scope.go:117] "RemoveContainer" containerID="f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.723347 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.741281 4790 scope.go:117] "RemoveContainer" containerID="3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525" Mar 13 20:50:27 crc kubenswrapper[4790]: E0313 20:50:27.742232 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525\": container with ID starting with 3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525 not found: ID does not 
exist" containerID="3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.742291 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525"} err="failed to get container status \"3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525\": rpc error: code = NotFound desc = could not find container \"3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525\": container with ID starting with 3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525 not found: ID does not exist" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.742327 4790 scope.go:117] "RemoveContainer" containerID="f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f" Mar 13 20:50:27 crc kubenswrapper[4790]: E0313 20:50:27.742869 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f\": container with ID starting with f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f not found: ID does not exist" containerID="f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.742918 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f"} err="failed to get container status \"f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f\": rpc error: code = NotFound desc = could not find container \"f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f\": container with ID starting with f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f not found: ID does not exist" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.745180 4790 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.765443 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:27 crc kubenswrapper[4790]: E0313 20:50:27.766020 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerName="nova-api-api" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.766039 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerName="nova-api-api" Mar 13 20:50:27 crc kubenswrapper[4790]: E0313 20:50:27.766058 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerName="nova-api-log" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.766066 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerName="nova-api-log" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.766297 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerName="nova-api-log" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.766331 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerName="nova-api-api" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.767591 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.769719 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.771542 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.771866 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.777226 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.793508 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.793563 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae43d767-425b-46ff-ba98-cc3dc9419ba5-logs\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.793606 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxzbh\" (UniqueName: \"kubernetes.io/projected/ae43d767-425b-46ff-ba98-cc3dc9419ba5-kube-api-access-xxzbh\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.793622 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-public-tls-certs\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.793835 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.794418 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-config-data\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.895957 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.896007 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae43d767-425b-46ff-ba98-cc3dc9419ba5-logs\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.896059 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxzbh\" (UniqueName: \"kubernetes.io/projected/ae43d767-425b-46ff-ba98-cc3dc9419ba5-kube-api-access-xxzbh\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 
20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.896089 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-public-tls-certs\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.896141 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.896235 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-config-data\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.897449 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae43d767-425b-46ff-ba98-cc3dc9419ba5-logs\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.900085 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.900651 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.900717 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.905710 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-config-data\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.912548 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxzbh\" (UniqueName: \"kubernetes.io/projected/ae43d767-425b-46ff-ba98-cc3dc9419ba5-kube-api-access-xxzbh\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.976912 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:28 crc kubenswrapper[4790]: I0313 20:50:28.013099 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:28 crc kubenswrapper[4790]: I0313 20:50:28.094851 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:50:28 crc kubenswrapper[4790]: I0313 20:50:28.535518 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:28 crc kubenswrapper[4790]: W0313 20:50:28.535955 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae43d767_425b_46ff_ba98_cc3dc9419ba5.slice/crio-edf979dc93b443df34ce2953c0ac6333708a8588e34d9da9b7381d8bf4639a82 WatchSource:0}: Error finding container edf979dc93b443df34ce2953c0ac6333708a8588e34d9da9b7381d8bf4639a82: Status 404 returned error can't find the container with id edf979dc93b443df34ce2953c0ac6333708a8588e34d9da9b7381d8bf4639a82 Mar 13 20:50:28 crc kubenswrapper[4790]: I0313 20:50:28.697149 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8215d8-8b4d-4c20-a832-e2088825019b","Type":"ContainerStarted","Data":"7987e2881155643a03157d6664eb40102409516e2d6983b6bde0009190d2b009"} Mar 13 20:50:28 crc kubenswrapper[4790]: I0313 20:50:28.700269 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae43d767-425b-46ff-ba98-cc3dc9419ba5","Type":"ContainerStarted","Data":"edf979dc93b443df34ce2953c0ac6333708a8588e34d9da9b7381d8bf4639a82"} Mar 13 20:50:28 crc kubenswrapper[4790]: I0313 20:50:28.717963 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:28 crc kubenswrapper[4790]: I0313 20:50:28.990302 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-sw4k5"] Mar 13 20:50:28 crc kubenswrapper[4790]: I0313 20:50:28.991476 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:28 crc kubenswrapper[4790]: I0313 20:50:28.995155 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 13 20:50:28 crc kubenswrapper[4790]: I0313 20:50:28.995597 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.004828 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-sw4k5"] Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.020746 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-scripts\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.020828 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.020883 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2jlt\" (UniqueName: \"kubernetes.io/projected/263e3744-6b98-4d91-aba2-cd28a616d9df-kube-api-access-p2jlt\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.020914 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-config-data\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.122873 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.123212 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2jlt\" (UniqueName: \"kubernetes.io/projected/263e3744-6b98-4d91-aba2-cd28a616d9df-kube-api-access-p2jlt\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.123250 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-config-data\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.123884 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-scripts\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.126695 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.127455 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-config-data\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.127892 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-scripts\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.139693 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2jlt\" (UniqueName: \"kubernetes.io/projected/263e3744-6b98-4d91-aba2-cd28a616d9df-kube-api-access-p2jlt\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.322099 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.697794 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" path="/var/lib/kubelet/pods/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8/volumes" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.733066 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8215d8-8b4d-4c20-a832-e2088825019b","Type":"ContainerStarted","Data":"4ebc3b3d24c09595199a728e2bcc2be34cd5ab68545cd7072d9ba0e08a6b3dd5"} Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.737866 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae43d767-425b-46ff-ba98-cc3dc9419ba5","Type":"ContainerStarted","Data":"fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11"} Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.737948 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae43d767-425b-46ff-ba98-cc3dc9419ba5","Type":"ContainerStarted","Data":"7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431"} Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.816459 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.816436681 podStartE2EDuration="2.816436681s" podCreationTimestamp="2026-03-13 20:50:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:50:29.766701854 +0000 UTC m=+1360.787817745" watchObservedRunningTime="2026-03-13 20:50:29.816436681 +0000 UTC m=+1360.837552562" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.819816 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-sw4k5"] Mar 13 20:50:30 crc kubenswrapper[4790]: I0313 20:50:30.748556 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sw4k5" event={"ID":"263e3744-6b98-4d91-aba2-cd28a616d9df","Type":"ContainerStarted","Data":"4f8e347d99704add2e53a060aced55cc22039113443643e0c09d3500a1b42570"} Mar 13 20:50:30 crc kubenswrapper[4790]: I0313 20:50:30.748885 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sw4k5" event={"ID":"263e3744-6b98-4d91-aba2-cd28a616d9df","Type":"ContainerStarted","Data":"4091775fd736a14082c9f6fe75f707311111beffcfee7aa2afc5e6278dd2f896"} Mar 13 20:50:30 crc kubenswrapper[4790]: I0313 20:50:30.769116 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-sw4k5" podStartSLOduration=2.769098814 podStartE2EDuration="2.769098814s" podCreationTimestamp="2026-03-13 20:50:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:50:30.760657274 +0000 UTC m=+1361.781773165" watchObservedRunningTime="2026-03-13 20:50:30.769098814 +0000 UTC m=+1361.790214705" Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.155756 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.282801 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-jstn6"] Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.283512 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" podUID="aa96d2ec-af8f-4ef3-96a2-108e003c669b" containerName="dnsmasq-dns" containerID="cri-o://31569c9f9b3d97fb94632b2003b39bbe5006cf77f3a60db89747921488537e4f" gracePeriod=10 Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.773153 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="aa96d2ec-af8f-4ef3-96a2-108e003c669b" containerID="31569c9f9b3d97fb94632b2003b39bbe5006cf77f3a60db89747921488537e4f" exitCode=0 Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.773389 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" event={"ID":"aa96d2ec-af8f-4ef3-96a2-108e003c669b","Type":"ContainerDied","Data":"31569c9f9b3d97fb94632b2003b39bbe5006cf77f3a60db89747921488537e4f"} Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.791532 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8215d8-8b4d-4c20-a832-e2088825019b","Type":"ContainerStarted","Data":"684aff5511e6e0a081533906daec355673be31064917c7fdefb18571783852b8"} Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.791754 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="proxy-httpd" containerID="cri-o://684aff5511e6e0a081533906daec355673be31064917c7fdefb18571783852b8" gracePeriod=30 Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.791856 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="ceilometer-notification-agent" containerID="cri-o://7987e2881155643a03157d6664eb40102409516e2d6983b6bde0009190d2b009" gracePeriod=30 Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.792021 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="sg-core" containerID="cri-o://4ebc3b3d24c09595199a728e2bcc2be34cd5ab68545cd7072d9ba0e08a6b3dd5" gracePeriod=30 Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.792059 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="ceilometer-central-agent" containerID="cri-o://32a172eb03df58396e7932c7253bd1c867efa547f28c6e6ae81472b2dd89ad69" gracePeriod=30 Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.792110 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.827499 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.004437636 podStartE2EDuration="6.827473261s" podCreationTimestamp="2026-03-13 20:50:25 +0000 UTC" firstStartedPulling="2026-03-13 20:50:26.495046682 +0000 UTC m=+1357.516162573" lastFinishedPulling="2026-03-13 20:50:31.318082307 +0000 UTC m=+1362.339198198" observedRunningTime="2026-03-13 20:50:31.819097602 +0000 UTC m=+1362.840213503" watchObservedRunningTime="2026-03-13 20:50:31.827473261 +0000 UTC m=+1362.848589172" Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.961433 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.996882 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-svc\") pod \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.997007 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-sb\") pod \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.054611 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aa96d2ec-af8f-4ef3-96a2-108e003c669b" (UID: "aa96d2ec-af8f-4ef3-96a2-108e003c669b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.055193 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aa96d2ec-af8f-4ef3-96a2-108e003c669b" (UID: "aa96d2ec-af8f-4ef3-96a2-108e003c669b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.099110 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-nb\") pod \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.099578 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxmdj\" (UniqueName: \"kubernetes.io/projected/aa96d2ec-af8f-4ef3-96a2-108e003c669b-kube-api-access-lxmdj\") pod \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.099634 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-swift-storage-0\") pod \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.099715 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-config\") pod \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.100305 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.100330 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-svc\") on node \"crc\" DevicePath 
\"\"" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.102955 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa96d2ec-af8f-4ef3-96a2-108e003c669b-kube-api-access-lxmdj" (OuterVolumeSpecName: "kube-api-access-lxmdj") pod "aa96d2ec-af8f-4ef3-96a2-108e003c669b" (UID: "aa96d2ec-af8f-4ef3-96a2-108e003c669b"). InnerVolumeSpecName "kube-api-access-lxmdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.148394 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aa96d2ec-af8f-4ef3-96a2-108e003c669b" (UID: "aa96d2ec-af8f-4ef3-96a2-108e003c669b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.158572 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-config" (OuterVolumeSpecName: "config") pod "aa96d2ec-af8f-4ef3-96a2-108e003c669b" (UID: "aa96d2ec-af8f-4ef3-96a2-108e003c669b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.171255 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aa96d2ec-af8f-4ef3-96a2-108e003c669b" (UID: "aa96d2ec-af8f-4ef3-96a2-108e003c669b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.202341 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.202394 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxmdj\" (UniqueName: \"kubernetes.io/projected/aa96d2ec-af8f-4ef3-96a2-108e003c669b-kube-api-access-lxmdj\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.202405 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.202416 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.802499 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" event={"ID":"aa96d2ec-af8f-4ef3-96a2-108e003c669b","Type":"ContainerDied","Data":"d410e2281728cc5d35324b5d9753eac6a283696daed20ea1b6c874c3b410e22c"} Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.802570 4790 scope.go:117] "RemoveContainer" containerID="31569c9f9b3d97fb94632b2003b39bbe5006cf77f3a60db89747921488537e4f" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.802745 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.807235 4790 generic.go:334] "Generic (PLEG): container finished" podID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerID="4ebc3b3d24c09595199a728e2bcc2be34cd5ab68545cd7072d9ba0e08a6b3dd5" exitCode=2 Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.807265 4790 generic.go:334] "Generic (PLEG): container finished" podID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerID="7987e2881155643a03157d6664eb40102409516e2d6983b6bde0009190d2b009" exitCode=0 Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.807287 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8215d8-8b4d-4c20-a832-e2088825019b","Type":"ContainerDied","Data":"4ebc3b3d24c09595199a728e2bcc2be34cd5ab68545cd7072d9ba0e08a6b3dd5"} Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.807317 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8215d8-8b4d-4c20-a832-e2088825019b","Type":"ContainerDied","Data":"7987e2881155643a03157d6664eb40102409516e2d6983b6bde0009190d2b009"} Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.830054 4790 scope.go:117] "RemoveContainer" containerID="62406a3417f49cd6fee467ec15aafed59672de36ebec3945dba28321522a57f0" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.870263 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-jstn6"] Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.882191 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-jstn6"] Mar 13 20:50:33 crc kubenswrapper[4790]: I0313 20:50:33.685360 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa96d2ec-af8f-4ef3-96a2-108e003c669b" path="/var/lib/kubelet/pods/aa96d2ec-af8f-4ef3-96a2-108e003c669b/volumes" Mar 13 20:50:33 crc kubenswrapper[4790]: I0313 20:50:33.818682 4790 
generic.go:334] "Generic (PLEG): container finished" podID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerID="32a172eb03df58396e7932c7253bd1c867efa547f28c6e6ae81472b2dd89ad69" exitCode=0 Mar 13 20:50:33 crc kubenswrapper[4790]: I0313 20:50:33.818771 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8215d8-8b4d-4c20-a832-e2088825019b","Type":"ContainerDied","Data":"32a172eb03df58396e7932c7253bd1c867efa547f28c6e6ae81472b2dd89ad69"} Mar 13 20:50:35 crc kubenswrapper[4790]: I0313 20:50:35.840977 4790 generic.go:334] "Generic (PLEG): container finished" podID="263e3744-6b98-4d91-aba2-cd28a616d9df" containerID="4f8e347d99704add2e53a060aced55cc22039113443643e0c09d3500a1b42570" exitCode=0 Mar 13 20:50:35 crc kubenswrapper[4790]: I0313 20:50:35.841034 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sw4k5" event={"ID":"263e3744-6b98-4d91-aba2-cd28a616d9df","Type":"ContainerDied","Data":"4f8e347d99704add2e53a060aced55cc22039113443643e0c09d3500a1b42570"} Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.198729 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.301072 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-scripts\") pod \"263e3744-6b98-4d91-aba2-cd28a616d9df\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.301171 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-config-data\") pod \"263e3744-6b98-4d91-aba2-cd28a616d9df\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.301217 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2jlt\" (UniqueName: \"kubernetes.io/projected/263e3744-6b98-4d91-aba2-cd28a616d9df-kube-api-access-p2jlt\") pod \"263e3744-6b98-4d91-aba2-cd28a616d9df\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.301282 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-combined-ca-bundle\") pod \"263e3744-6b98-4d91-aba2-cd28a616d9df\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.306781 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-scripts" (OuterVolumeSpecName: "scripts") pod "263e3744-6b98-4d91-aba2-cd28a616d9df" (UID: "263e3744-6b98-4d91-aba2-cd28a616d9df"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.307061 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/263e3744-6b98-4d91-aba2-cd28a616d9df-kube-api-access-p2jlt" (OuterVolumeSpecName: "kube-api-access-p2jlt") pod "263e3744-6b98-4d91-aba2-cd28a616d9df" (UID: "263e3744-6b98-4d91-aba2-cd28a616d9df"). InnerVolumeSpecName "kube-api-access-p2jlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.327099 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-config-data" (OuterVolumeSpecName: "config-data") pod "263e3744-6b98-4d91-aba2-cd28a616d9df" (UID: "263e3744-6b98-4d91-aba2-cd28a616d9df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.332879 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "263e3744-6b98-4d91-aba2-cd28a616d9df" (UID: "263e3744-6b98-4d91-aba2-cd28a616d9df"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.403409 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.403461 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.403477 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2jlt\" (UniqueName: \"kubernetes.io/projected/263e3744-6b98-4d91-aba2-cd28a616d9df-kube-api-access-p2jlt\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.403492 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:37 crc kubenswrapper[4790]: E0313 20:50:37.826737 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode71d98c3_e247_448e_945e_016a6755c689.slice/crio-55f3196c901a679f999ea7048b99d1e69e5d8f8dcae2885a569b98a151420968\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod263e3744_6b98_4d91_aba2_cd28a616d9df.slice/crio-4091775fd736a14082c9f6fe75f707311111beffcfee7aa2afc5e6278dd2f896\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod263e3744_6b98_4d91_aba2_cd28a616d9df.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37e33a9e_1def_49b1_b1a7_81be1f5e72ee.slice/crio-a7693eddaf0a22601e6dc9f54784ec4f74f708b3aed816092645a24ca4db0419\": RecentStats: unable to find data in memory cache]" Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.858750 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sw4k5" event={"ID":"263e3744-6b98-4d91-aba2-cd28a616d9df","Type":"ContainerDied","Data":"4091775fd736a14082c9f6fe75f707311111beffcfee7aa2afc5e6278dd2f896"} Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.859157 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4091775fd736a14082c9f6fe75f707311111beffcfee7aa2afc5e6278dd2f896" Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.858828 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.038980 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.039260 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4" containerName="nova-scheduler-scheduler" containerID="cri-o://94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e" gracePeriod=30 Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.050799 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.051062 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ae43d767-425b-46ff-ba98-cc3dc9419ba5" containerName="nova-api-log" containerID="cri-o://7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431" gracePeriod=30 Mar 13 20:50:38 crc kubenswrapper[4790]: 
I0313 20:50:38.051143 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ae43d767-425b-46ff-ba98-cc3dc9419ba5" containerName="nova-api-api" containerID="cri-o://fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11" gracePeriod=30 Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.124114 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.124433 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b6868acd-5476-49b4-958c-8f68fde161b9" containerName="nova-metadata-log" containerID="cri-o://8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67" gracePeriod=30 Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.124567 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b6868acd-5476-49b4-958c-8f68fde161b9" containerName="nova-metadata-metadata" containerID="cri-o://ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498" gracePeriod=30 Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.632141 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:50:38 crc kubenswrapper[4790]: E0313 20:50:38.719029 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 20:50:38 crc kubenswrapper[4790]: E0313 20:50:38.721870 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 20:50:38 crc kubenswrapper[4790]: E0313 20:50:38.724130 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 20:50:38 crc kubenswrapper[4790]: E0313 20:50:38.724205 4790 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4" containerName="nova-scheduler-scheduler" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.743298 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-combined-ca-bundle\") pod \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " Mar 13 20:50:38 crc 
kubenswrapper[4790]: I0313 20:50:38.743396 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-internal-tls-certs\") pod \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.744023 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-config-data\") pod \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.744063 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-public-tls-certs\") pod \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.744141 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae43d767-425b-46ff-ba98-cc3dc9419ba5-logs\") pod \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.744173 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxzbh\" (UniqueName: \"kubernetes.io/projected/ae43d767-425b-46ff-ba98-cc3dc9419ba5-kube-api-access-xxzbh\") pod \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.745205 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae43d767-425b-46ff-ba98-cc3dc9419ba5-logs" (OuterVolumeSpecName: "logs") pod 
"ae43d767-425b-46ff-ba98-cc3dc9419ba5" (UID: "ae43d767-425b-46ff-ba98-cc3dc9419ba5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.749034 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae43d767-425b-46ff-ba98-cc3dc9419ba5-kube-api-access-xxzbh" (OuterVolumeSpecName: "kube-api-access-xxzbh") pod "ae43d767-425b-46ff-ba98-cc3dc9419ba5" (UID: "ae43d767-425b-46ff-ba98-cc3dc9419ba5"). InnerVolumeSpecName "kube-api-access-xxzbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.777653 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae43d767-425b-46ff-ba98-cc3dc9419ba5" (UID: "ae43d767-425b-46ff-ba98-cc3dc9419ba5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.778489 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-config-data" (OuterVolumeSpecName: "config-data") pod "ae43d767-425b-46ff-ba98-cc3dc9419ba5" (UID: "ae43d767-425b-46ff-ba98-cc3dc9419ba5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.799549 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ae43d767-425b-46ff-ba98-cc3dc9419ba5" (UID: "ae43d767-425b-46ff-ba98-cc3dc9419ba5"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.818857 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ae43d767-425b-46ff-ba98-cc3dc9419ba5" (UID: "ae43d767-425b-46ff-ba98-cc3dc9419ba5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.847323 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.847693 4790 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.847754 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae43d767-425b-46ff-ba98-cc3dc9419ba5-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.847804 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxzbh\" (UniqueName: \"kubernetes.io/projected/ae43d767-425b-46ff-ba98-cc3dc9419ba5-kube-api-access-xxzbh\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.847868 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.847919 4790 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.870351 4790 generic.go:334] "Generic (PLEG): container finished" podID="b6868acd-5476-49b4-958c-8f68fde161b9" containerID="8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67" exitCode=143 Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.870456 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6868acd-5476-49b4-958c-8f68fde161b9","Type":"ContainerDied","Data":"8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67"} Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.873815 4790 generic.go:334] "Generic (PLEG): container finished" podID="ae43d767-425b-46ff-ba98-cc3dc9419ba5" containerID="fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11" exitCode=0 Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.873907 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae43d767-425b-46ff-ba98-cc3dc9419ba5","Type":"ContainerDied","Data":"fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11"} Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.873989 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae43d767-425b-46ff-ba98-cc3dc9419ba5","Type":"ContainerDied","Data":"7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431"} Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.873901 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.874023 4790 scope.go:117] "RemoveContainer" containerID="fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.873940 4790 generic.go:334] "Generic (PLEG): container finished" podID="ae43d767-425b-46ff-ba98-cc3dc9419ba5" containerID="7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431" exitCode=143 Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.874242 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae43d767-425b-46ff-ba98-cc3dc9419ba5","Type":"ContainerDied","Data":"edf979dc93b443df34ce2953c0ac6333708a8588e34d9da9b7381d8bf4639a82"} Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.908738 4790 scope.go:117] "RemoveContainer" containerID="7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.919051 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.929444 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.939759 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:38 crc kubenswrapper[4790]: E0313 20:50:38.940456 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa96d2ec-af8f-4ef3-96a2-108e003c669b" containerName="init" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.940550 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa96d2ec-af8f-4ef3-96a2-108e003c669b" containerName="init" Mar 13 20:50:38 crc kubenswrapper[4790]: E0313 20:50:38.940642 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="263e3744-6b98-4d91-aba2-cd28a616d9df" containerName="nova-manage" Mar 13 20:50:38 crc 
kubenswrapper[4790]: I0313 20:50:38.940724 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="263e3744-6b98-4d91-aba2-cd28a616d9df" containerName="nova-manage" Mar 13 20:50:38 crc kubenswrapper[4790]: E0313 20:50:38.940806 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae43d767-425b-46ff-ba98-cc3dc9419ba5" containerName="nova-api-log" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.940910 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae43d767-425b-46ff-ba98-cc3dc9419ba5" containerName="nova-api-log" Mar 13 20:50:38 crc kubenswrapper[4790]: E0313 20:50:38.941009 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa96d2ec-af8f-4ef3-96a2-108e003c669b" containerName="dnsmasq-dns" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.941077 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa96d2ec-af8f-4ef3-96a2-108e003c669b" containerName="dnsmasq-dns" Mar 13 20:50:38 crc kubenswrapper[4790]: E0313 20:50:38.941149 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae43d767-425b-46ff-ba98-cc3dc9419ba5" containerName="nova-api-api" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.941216 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae43d767-425b-46ff-ba98-cc3dc9419ba5" containerName="nova-api-api" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.941541 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae43d767-425b-46ff-ba98-cc3dc9419ba5" containerName="nova-api-api" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.941623 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="263e3744-6b98-4d91-aba2-cd28a616d9df" containerName="nova-manage" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.941695 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa96d2ec-af8f-4ef3-96a2-108e003c669b" containerName="dnsmasq-dns" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.941760 
4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae43d767-425b-46ff-ba98-cc3dc9419ba5" containerName="nova-api-log" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.942807 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.945618 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.947333 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.949875 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.951960 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.972029 4790 scope.go:117] "RemoveContainer" containerID="fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11" Mar 13 20:50:38 crc kubenswrapper[4790]: E0313 20:50:38.974790 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11\": container with ID starting with fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11 not found: ID does not exist" containerID="fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.974878 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11"} err="failed to get container status \"fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11\": rpc error: code = NotFound desc = could not find container 
\"fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11\": container with ID starting with fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11 not found: ID does not exist" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.974918 4790 scope.go:117] "RemoveContainer" containerID="7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431" Mar 13 20:50:38 crc kubenswrapper[4790]: E0313 20:50:38.975294 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431\": container with ID starting with 7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431 not found: ID does not exist" containerID="7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.975318 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431"} err="failed to get container status \"7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431\": rpc error: code = NotFound desc = could not find container \"7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431\": container with ID starting with 7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431 not found: ID does not exist" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.975332 4790 scope.go:117] "RemoveContainer" containerID="fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.976211 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11"} err="failed to get container status \"fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11\": rpc error: code = NotFound desc = could not find 
container \"fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11\": container with ID starting with fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11 not found: ID does not exist" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.976369 4790 scope.go:117] "RemoveContainer" containerID="7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.980576 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431"} err="failed to get container status \"7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431\": rpc error: code = NotFound desc = could not find container \"7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431\": container with ID starting with 7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431 not found: ID does not exist" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.051886 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-config-data\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.052221 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.052314 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-combined-ca-bundle\") 
pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.052660 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-logs\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.052760 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-public-tls-certs\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.052848 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-445zd\" (UniqueName: \"kubernetes.io/projected/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-kube-api-access-445zd\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.155875 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-config-data\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.155968 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.156013 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.156108 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-logs\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.156140 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-public-tls-certs\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.156207 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-445zd\" (UniqueName: \"kubernetes.io/projected/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-kube-api-access-445zd\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.156991 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-logs\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.160281 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-config-data\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " 
pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.160977 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-public-tls-certs\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.162963 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.164775 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.177598 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-445zd\" (UniqueName: \"kubernetes.io/projected/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-kube-api-access-445zd\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.268141 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.670627 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae43d767-425b-46ff-ba98-cc3dc9419ba5" path="/var/lib/kubelet/pods/ae43d767-425b-46ff-ba98-cc3dc9419ba5/volumes" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.706189 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.901622 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4597d91c-0f9f-4e33-aaa7-b25e7076e13a","Type":"ContainerStarted","Data":"3f30e0061e280a401241f19ee12e494cd4b030bd562acd0b72ded4501c35c83e"} Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.901683 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4597d91c-0f9f-4e33-aaa7-b25e7076e13a","Type":"ContainerStarted","Data":"ff0150884b11d59c283f22874c57891380471d45835859210f6d8343f899227b"} Mar 13 20:50:40 crc kubenswrapper[4790]: I0313 20:50:40.915459 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4597d91c-0f9f-4e33-aaa7-b25e7076e13a","Type":"ContainerStarted","Data":"66096be8201acafa7bf92ee6f89be2e7ad60634981a3b1ad6019d36671094cb8"} Mar 13 20:50:40 crc kubenswrapper[4790]: I0313 20:50:40.941942 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.941919523 podStartE2EDuration="2.941919523s" podCreationTimestamp="2026-03-13 20:50:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:50:40.932968978 +0000 UTC m=+1371.954084869" watchObservedRunningTime="2026-03-13 20:50:40.941919523 +0000 UTC m=+1371.963035414" Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.810976 4790 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.911205 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-config-data\") pod \"b6868acd-5476-49b4-958c-8f68fde161b9\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.911258 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g45p4\" (UniqueName: \"kubernetes.io/projected/b6868acd-5476-49b4-958c-8f68fde161b9-kube-api-access-g45p4\") pod \"b6868acd-5476-49b4-958c-8f68fde161b9\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.911285 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-combined-ca-bundle\") pod \"b6868acd-5476-49b4-958c-8f68fde161b9\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.911445 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6868acd-5476-49b4-958c-8f68fde161b9-logs\") pod \"b6868acd-5476-49b4-958c-8f68fde161b9\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.911549 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-nova-metadata-tls-certs\") pod \"b6868acd-5476-49b4-958c-8f68fde161b9\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.911839 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b6868acd-5476-49b4-958c-8f68fde161b9-logs" (OuterVolumeSpecName: "logs") pod "b6868acd-5476-49b4-958c-8f68fde161b9" (UID: "b6868acd-5476-49b4-958c-8f68fde161b9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.912241 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6868acd-5476-49b4-958c-8f68fde161b9-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.916579 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6868acd-5476-49b4-958c-8f68fde161b9-kube-api-access-g45p4" (OuterVolumeSpecName: "kube-api-access-g45p4") pod "b6868acd-5476-49b4-958c-8f68fde161b9" (UID: "b6868acd-5476-49b4-958c-8f68fde161b9"). InnerVolumeSpecName "kube-api-access-g45p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.926875 4790 generic.go:334] "Generic (PLEG): container finished" podID="b6868acd-5476-49b4-958c-8f68fde161b9" containerID="ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498" exitCode=0 Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.926959 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6868acd-5476-49b4-958c-8f68fde161b9","Type":"ContainerDied","Data":"ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498"} Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.926988 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6868acd-5476-49b4-958c-8f68fde161b9","Type":"ContainerDied","Data":"1998e9c0ed2f49c51df1fb979275385f3c3c928b8ffbba368fee9881d45e3a34"} Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.927006 4790 scope.go:117] "RemoveContainer" 
containerID="ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498" Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.927043 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.937715 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-config-data" (OuterVolumeSpecName: "config-data") pod "b6868acd-5476-49b4-958c-8f68fde161b9" (UID: "b6868acd-5476-49b4-958c-8f68fde161b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.953564 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6868acd-5476-49b4-958c-8f68fde161b9" (UID: "b6868acd-5476-49b4-958c-8f68fde161b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.973553 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b6868acd-5476-49b4-958c-8f68fde161b9" (UID: "b6868acd-5476-49b4-958c-8f68fde161b9"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.016687 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.016752 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g45p4\" (UniqueName: \"kubernetes.io/projected/b6868acd-5476-49b4-958c-8f68fde161b9-kube-api-access-g45p4\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.016765 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.016777 4790 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.033826 4790 scope.go:117] "RemoveContainer" containerID="8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.054545 4790 scope.go:117] "RemoveContainer" containerID="ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498" Mar 13 20:50:42 crc kubenswrapper[4790]: E0313 20:50:42.054969 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498\": container with ID starting with ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498 not found: ID does not exist" containerID="ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498" Mar 13 20:50:42 crc 
kubenswrapper[4790]: I0313 20:50:42.055033 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498"} err="failed to get container status \"ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498\": rpc error: code = NotFound desc = could not find container \"ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498\": container with ID starting with ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498 not found: ID does not exist" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.055064 4790 scope.go:117] "RemoveContainer" containerID="8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67" Mar 13 20:50:42 crc kubenswrapper[4790]: E0313 20:50:42.055335 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67\": container with ID starting with 8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67 not found: ID does not exist" containerID="8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.055385 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67"} err="failed to get container status \"8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67\": rpc error: code = NotFound desc = could not find container \"8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67\": container with ID starting with 8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67 not found: ID does not exist" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.256844 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:50:42 crc 
kubenswrapper[4790]: I0313 20:50:42.306192 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.313334 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:50:42 crc kubenswrapper[4790]: E0313 20:50:42.313967 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6868acd-5476-49b4-958c-8f68fde161b9" containerName="nova-metadata-metadata" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.313993 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6868acd-5476-49b4-958c-8f68fde161b9" containerName="nova-metadata-metadata" Mar 13 20:50:42 crc kubenswrapper[4790]: E0313 20:50:42.314023 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6868acd-5476-49b4-958c-8f68fde161b9" containerName="nova-metadata-log" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.314032 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6868acd-5476-49b4-958c-8f68fde161b9" containerName="nova-metadata-log" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.314294 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6868acd-5476-49b4-958c-8f68fde161b9" containerName="nova-metadata-log" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.314319 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6868acd-5476-49b4-958c-8f68fde161b9" containerName="nova-metadata-metadata" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.315469 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.318742 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.318815 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.332564 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.423987 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00b43558-bdf4-45e4-b1bc-6e9b325e163b-logs\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.424367 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00b43558-bdf4-45e4-b1bc-6e9b325e163b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.424446 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00b43558-bdf4-45e4-b1bc-6e9b325e163b-config-data\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.424481 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/00b43558-bdf4-45e4-b1bc-6e9b325e163b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.424530 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r52ms\" (UniqueName: \"kubernetes.io/projected/00b43558-bdf4-45e4-b1bc-6e9b325e163b-kube-api-access-r52ms\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.525750 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00b43558-bdf4-45e4-b1bc-6e9b325e163b-logs\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.525833 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00b43558-bdf4-45e4-b1bc-6e9b325e163b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.525883 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00b43558-bdf4-45e4-b1bc-6e9b325e163b-config-data\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.525913 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/00b43558-bdf4-45e4-b1bc-6e9b325e163b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.525933 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r52ms\" (UniqueName: \"kubernetes.io/projected/00b43558-bdf4-45e4-b1bc-6e9b325e163b-kube-api-access-r52ms\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.526949 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00b43558-bdf4-45e4-b1bc-6e9b325e163b-logs\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.529992 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00b43558-bdf4-45e4-b1bc-6e9b325e163b-config-data\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.530162 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/00b43558-bdf4-45e4-b1bc-6e9b325e163b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.530699 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00b43558-bdf4-45e4-b1bc-6e9b325e163b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.543800 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r52ms\" (UniqueName: \"kubernetes.io/projected/00b43558-bdf4-45e4-b1bc-6e9b325e163b-kube-api-access-r52ms\") pod 
\"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.638559 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.754082 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.834863 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7pzd\" (UniqueName: \"kubernetes.io/projected/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-kube-api-access-x7pzd\") pod \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.834930 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-combined-ca-bundle\") pod \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.835005 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-config-data\") pod \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.843118 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-kube-api-access-x7pzd" (OuterVolumeSpecName: "kube-api-access-x7pzd") pod "dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4" (UID: "dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4"). InnerVolumeSpecName "kube-api-access-x7pzd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.868854 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-config-data" (OuterVolumeSpecName: "config-data") pod "dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4" (UID: "dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.871880 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4" (UID: "dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.937061 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7pzd\" (UniqueName: \"kubernetes.io/projected/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-kube-api-access-x7pzd\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.937100 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.937112 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.938316 4790 generic.go:334] "Generic (PLEG): container finished" podID="dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4" containerID="94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e" 
exitCode=0 Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.938346 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4","Type":"ContainerDied","Data":"94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e"} Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.938397 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4","Type":"ContainerDied","Data":"6e7ed25e629647fdcbcdadd57d668b2cdfbe95f3b450a5732f265251e324ddb9"} Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.938417 4790 scope.go:117] "RemoveContainer" containerID="94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.938536 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.973811 4790 scope.go:117] "RemoveContainer" containerID="94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e" Mar 13 20:50:42 crc kubenswrapper[4790]: E0313 20:50:42.974221 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e\": container with ID starting with 94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e not found: ID does not exist" containerID="94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.974250 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e"} err="failed to get container status \"94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e\": rpc error: code = NotFound desc = could not find 
container \"94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e\": container with ID starting with 94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e not found: ID does not exist" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.978241 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.988063 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.002356 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:50:43 crc kubenswrapper[4790]: E0313 20:50:43.002790 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4" containerName="nova-scheduler-scheduler" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.002803 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4" containerName="nova-scheduler-scheduler" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.002992 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4" containerName="nova-scheduler-scheduler" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.003745 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.006111 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.011567 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.076569 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.139732 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jqk4\" (UniqueName: \"kubernetes.io/projected/01e86425-f126-4827-b727-e8c73d152aa6-kube-api-access-5jqk4\") pod \"nova-scheduler-0\" (UID: \"01e86425-f126-4827-b727-e8c73d152aa6\") " pod="openstack/nova-scheduler-0" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.139785 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e86425-f126-4827-b727-e8c73d152aa6-config-data\") pod \"nova-scheduler-0\" (UID: \"01e86425-f126-4827-b727-e8c73d152aa6\") " pod="openstack/nova-scheduler-0" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.139820 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e86425-f126-4827-b727-e8c73d152aa6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"01e86425-f126-4827-b727-e8c73d152aa6\") " pod="openstack/nova-scheduler-0" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.241179 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jqk4\" (UniqueName: \"kubernetes.io/projected/01e86425-f126-4827-b727-e8c73d152aa6-kube-api-access-5jqk4\") pod \"nova-scheduler-0\" (UID: 
\"01e86425-f126-4827-b727-e8c73d152aa6\") " pod="openstack/nova-scheduler-0" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.241540 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e86425-f126-4827-b727-e8c73d152aa6-config-data\") pod \"nova-scheduler-0\" (UID: \"01e86425-f126-4827-b727-e8c73d152aa6\") " pod="openstack/nova-scheduler-0" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.241591 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e86425-f126-4827-b727-e8c73d152aa6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"01e86425-f126-4827-b727-e8c73d152aa6\") " pod="openstack/nova-scheduler-0" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.245266 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e86425-f126-4827-b727-e8c73d152aa6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"01e86425-f126-4827-b727-e8c73d152aa6\") " pod="openstack/nova-scheduler-0" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.245710 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e86425-f126-4827-b727-e8c73d152aa6-config-data\") pod \"nova-scheduler-0\" (UID: \"01e86425-f126-4827-b727-e8c73d152aa6\") " pod="openstack/nova-scheduler-0" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.261694 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jqk4\" (UniqueName: \"kubernetes.io/projected/01e86425-f126-4827-b727-e8c73d152aa6-kube-api-access-5jqk4\") pod \"nova-scheduler-0\" (UID: \"01e86425-f126-4827-b727-e8c73d152aa6\") " pod="openstack/nova-scheduler-0" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.321650 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.670515 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6868acd-5476-49b4-958c-8f68fde161b9" path="/var/lib/kubelet/pods/b6868acd-5476-49b4-958c-8f68fde161b9/volumes" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.671505 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4" path="/var/lib/kubelet/pods/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4/volumes" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.801555 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.947473 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"01e86425-f126-4827-b727-e8c73d152aa6","Type":"ContainerStarted","Data":"19b8700ac8cce409d32085026c5d7ffcaf0b8e22f583b1a8ade9d67819c332dd"} Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.949549 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"00b43558-bdf4-45e4-b1bc-6e9b325e163b","Type":"ContainerStarted","Data":"e5f1dcb2f34dfe470d97b1c12abcb44998274cb922a1ab2d2504d7e0b8c6bc9e"} Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.949572 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"00b43558-bdf4-45e4-b1bc-6e9b325e163b","Type":"ContainerStarted","Data":"9a99e76dfaa85feb5f4523b3781167ab6fc1367c7b8e8e94f4e06e4e02c29d88"} Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.949582 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"00b43558-bdf4-45e4-b1bc-6e9b325e163b","Type":"ContainerStarted","Data":"6f9eb52c8c676828f1c30311e3603b02d4889a09c5d60400352372d8d2e38285"} Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.972108 4790 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.9720813069999998 podStartE2EDuration="1.972081307s" podCreationTimestamp="2026-03-13 20:50:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:50:43.967189375 +0000 UTC m=+1374.988305276" watchObservedRunningTime="2026-03-13 20:50:43.972081307 +0000 UTC m=+1374.993197188" Mar 13 20:50:44 crc kubenswrapper[4790]: I0313 20:50:44.015410 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:50:44 crc kubenswrapper[4790]: I0313 20:50:44.015472 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:50:44 crc kubenswrapper[4790]: I0313 20:50:44.973746 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"01e86425-f126-4827-b727-e8c73d152aa6","Type":"ContainerStarted","Data":"c6dd172d17edfa4d9ed750f7157dd649bef6a98b28de8ea3e32b681e355e3615"} Mar 13 20:50:44 crc kubenswrapper[4790]: I0313 20:50:44.993117 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.993101106 podStartE2EDuration="2.993101106s" podCreationTimestamp="2026-03-13 20:50:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:50:44.989664312 +0000 UTC 
m=+1376.010780213" watchObservedRunningTime="2026-03-13 20:50:44.993101106 +0000 UTC m=+1376.014216997" Mar 13 20:50:48 crc kubenswrapper[4790]: E0313 20:50:48.051001 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode71d98c3_e247_448e_945e_016a6755c689.slice/crio-55f3196c901a679f999ea7048b99d1e69e5d8f8dcae2885a569b98a151420968\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37e33a9e_1def_49b1_b1a7_81be1f5e72ee.slice/crio-a7693eddaf0a22601e6dc9f54784ec4f74f708b3aed816092645a24ca4db0419\": RecentStats: unable to find data in memory cache]" Mar 13 20:50:48 crc kubenswrapper[4790]: I0313 20:50:48.322794 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 20:50:49 crc kubenswrapper[4790]: I0313 20:50:49.268657 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 20:50:49 crc kubenswrapper[4790]: I0313 20:50:49.268733 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 20:50:50 crc kubenswrapper[4790]: I0313 20:50:50.280588 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4597d91c-0f9f-4e33-aaa7-b25e7076e13a" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 20:50:50 crc kubenswrapper[4790]: I0313 20:50:50.280774 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4597d91c-0f9f-4e33-aaa7-b25e7076e13a" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 20:50:52 crc 
kubenswrapper[4790]: I0313 20:50:52.639098 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 20:50:52 crc kubenswrapper[4790]: I0313 20:50:52.639193 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 20:50:52 crc kubenswrapper[4790]: I0313 20:50:52.707176 4790 scope.go:117] "RemoveContainer" containerID="721d15acd59eb0b2b9f8d48eaa51f02f0b2b5cc626d1243f5a398968f008ce5a" Mar 13 20:50:53 crc kubenswrapper[4790]: I0313 20:50:53.322168 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 20:50:53 crc kubenswrapper[4790]: I0313 20:50:53.357259 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 20:50:53 crc kubenswrapper[4790]: I0313 20:50:53.652615 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="00b43558-bdf4-45e4-b1bc-6e9b325e163b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 20:50:53 crc kubenswrapper[4790]: I0313 20:50:53.653059 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="00b43558-bdf4-45e4-b1bc-6e9b325e163b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 20:50:54 crc kubenswrapper[4790]: I0313 20:50:54.088991 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 20:50:56 crc kubenswrapper[4790]: I0313 20:50:56.064893 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="proxy-httpd" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 13 20:50:57 crc kubenswrapper[4790]: I0313 20:50:57.268421 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 20:50:57 crc kubenswrapper[4790]: I0313 20:50:57.268480 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 20:50:57 crc kubenswrapper[4790]: I0313 20:50:57.695779 4790 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pode5f74b87-8c4a-490f-ad9c-75ba17e3a1a8"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pode5f74b87-8c4a-490f-ad9c-75ba17e3a1a8] : Timed out while waiting for systemd to remove kubepods-besteffort-pode5f74b87_8c4a_490f_ad9c_75ba17e3a1a8.slice" Mar 13 20:50:59 crc kubenswrapper[4790]: I0313 20:50:59.274739 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 20:50:59 crc kubenswrapper[4790]: I0313 20:50:59.277005 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 20:50:59 crc kubenswrapper[4790]: I0313 20:50:59.283416 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 20:51:00 crc kubenswrapper[4790]: I0313 20:51:00.116256 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 20:51:00 crc kubenswrapper[4790]: I0313 20:51:00.639749 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 20:51:00 crc kubenswrapper[4790]: I0313 20:51:00.640151 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.129695 4790 generic.go:334] "Generic (PLEG): container finished" podID="dd8215d8-8b4d-4c20-a832-e2088825019b" 
containerID="684aff5511e6e0a081533906daec355673be31064917c7fdefb18571783852b8" exitCode=137 Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.129742 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8215d8-8b4d-4c20-a832-e2088825019b","Type":"ContainerDied","Data":"684aff5511e6e0a081533906daec355673be31064917c7fdefb18571783852b8"} Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.129960 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8215d8-8b4d-4c20-a832-e2088825019b","Type":"ContainerDied","Data":"18dee079958c9239905a09047fe9e0fae646c0f6d6b8cea56e2986dac7e9414c"} Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.129975 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18dee079958c9239905a09047fe9e0fae646c0f6d6b8cea56e2986dac7e9414c" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.181304 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.244625 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-scripts\") pod \"dd8215d8-8b4d-4c20-a832-e2088825019b\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.244677 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq5f9\" (UniqueName: \"kubernetes.io/projected/dd8215d8-8b4d-4c20-a832-e2088825019b-kube-api-access-xq5f9\") pod \"dd8215d8-8b4d-4c20-a832-e2088825019b\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.244693 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-sg-core-conf-yaml\") pod \"dd8215d8-8b4d-4c20-a832-e2088825019b\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.244723 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-ceilometer-tls-certs\") pod \"dd8215d8-8b4d-4c20-a832-e2088825019b\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.244754 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-combined-ca-bundle\") pod \"dd8215d8-8b4d-4c20-a832-e2088825019b\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.244805 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-config-data\") pod \"dd8215d8-8b4d-4c20-a832-e2088825019b\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.244844 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-run-httpd\") pod \"dd8215d8-8b4d-4c20-a832-e2088825019b\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.244883 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-log-httpd\") pod \"dd8215d8-8b4d-4c20-a832-e2088825019b\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.246208 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dd8215d8-8b4d-4c20-a832-e2088825019b" (UID: "dd8215d8-8b4d-4c20-a832-e2088825019b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.246322 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dd8215d8-8b4d-4c20-a832-e2088825019b" (UID: "dd8215d8-8b4d-4c20-a832-e2088825019b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.255714 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd8215d8-8b4d-4c20-a832-e2088825019b-kube-api-access-xq5f9" (OuterVolumeSpecName: "kube-api-access-xq5f9") pod "dd8215d8-8b4d-4c20-a832-e2088825019b" (UID: "dd8215d8-8b4d-4c20-a832-e2088825019b"). InnerVolumeSpecName "kube-api-access-xq5f9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.256538 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-scripts" (OuterVolumeSpecName: "scripts") pod "dd8215d8-8b4d-4c20-a832-e2088825019b" (UID: "dd8215d8-8b4d-4c20-a832-e2088825019b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.281243 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dd8215d8-8b4d-4c20-a832-e2088825019b" (UID: "dd8215d8-8b4d-4c20-a832-e2088825019b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.294363 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "dd8215d8-8b4d-4c20-a832-e2088825019b" (UID: "dd8215d8-8b4d-4c20-a832-e2088825019b"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.318620 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd8215d8-8b4d-4c20-a832-e2088825019b" (UID: "dd8215d8-8b4d-4c20-a832-e2088825019b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.343485 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-config-data" (OuterVolumeSpecName: "config-data") pod "dd8215d8-8b4d-4c20-a832-e2088825019b" (UID: "dd8215d8-8b4d-4c20-a832-e2088825019b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.347916 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.347951 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xq5f9\" (UniqueName: \"kubernetes.io/projected/dd8215d8-8b4d-4c20-a832-e2088825019b-kube-api-access-xq5f9\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.347965 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.347977 4790 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-ceilometer-tls-certs\") on node \"crc\" 
DevicePath \"\"" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.347987 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.347998 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.348008 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.348019 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.646630 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.650220 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.656676 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.137642 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.142993 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.187429 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.212537 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.227530 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:51:03 crc kubenswrapper[4790]: E0313 20:51:03.228073 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="ceilometer-notification-agent" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.228097 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="ceilometer-notification-agent" Mar 13 20:51:03 crc kubenswrapper[4790]: E0313 20:51:03.228122 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="proxy-httpd" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.228132 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="proxy-httpd" Mar 13 20:51:03 crc kubenswrapper[4790]: E0313 20:51:03.228154 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="sg-core" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.228163 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="sg-core" Mar 13 20:51:03 crc kubenswrapper[4790]: E0313 20:51:03.228186 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="ceilometer-central-agent" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.228195 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="ceilometer-central-agent" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.228418 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="sg-core" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.228451 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="proxy-httpd" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.228469 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="ceilometer-central-agent" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.228491 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="ceilometer-notification-agent" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.230418 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.235680 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.236208 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.236531 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.243312 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.367109 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.367158 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.367270 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-scripts\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.367461 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2645f50-482e-487d-9b16-c2a066630480-run-httpd\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.367484 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2645f50-482e-487d-9b16-c2a066630480-log-httpd\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.367527 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-config-data\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.367544 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.367578 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv28v\" (UniqueName: \"kubernetes.io/projected/d2645f50-482e-487d-9b16-c2a066630480-kube-api-access-fv28v\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.469308 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2645f50-482e-487d-9b16-c2a066630480-run-httpd\") pod \"ceilometer-0\" (UID: 
\"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.469360 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2645f50-482e-487d-9b16-c2a066630480-log-httpd\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.469479 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-config-data\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.469505 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.469559 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv28v\" (UniqueName: \"kubernetes.io/projected/d2645f50-482e-487d-9b16-c2a066630480-kube-api-access-fv28v\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.469619 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.469645 4790 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.469702 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-scripts\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.470020 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2645f50-482e-487d-9b16-c2a066630480-log-httpd\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.470029 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2645f50-482e-487d-9b16-c2a066630480-run-httpd\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.475162 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-scripts\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.475255 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.475706 
4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.478117 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.484001 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-config-data\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.490776 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv28v\" (UniqueName: \"kubernetes.io/projected/d2645f50-482e-487d-9b16-c2a066630480-kube-api-access-fv28v\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.554548 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.680953 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" path="/var/lib/kubelet/pods/dd8215d8-8b4d-4c20-a832-e2088825019b/volumes" Mar 13 20:51:04 crc kubenswrapper[4790]: W0313 20:51:04.019024 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2645f50_482e_487d_9b16_c2a066630480.slice/crio-7ec42e3748933bdc74241b6eb424471bc69ad21f327f83d544cb366107361b3c WatchSource:0}: Error finding container 7ec42e3748933bdc74241b6eb424471bc69ad21f327f83d544cb366107361b3c: Status 404 returned error can't find the container with id 7ec42e3748933bdc74241b6eb424471bc69ad21f327f83d544cb366107361b3c Mar 13 20:51:04 crc kubenswrapper[4790]: I0313 20:51:04.028451 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:51:04 crc kubenswrapper[4790]: I0313 20:51:04.148792 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2645f50-482e-487d-9b16-c2a066630480","Type":"ContainerStarted","Data":"7ec42e3748933bdc74241b6eb424471bc69ad21f327f83d544cb366107361b3c"} Mar 13 20:51:05 crc kubenswrapper[4790]: I0313 20:51:05.158648 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2645f50-482e-487d-9b16-c2a066630480","Type":"ContainerStarted","Data":"feffa8201208f291d13abd35b2c2dd546d70c74d9baf779e596335b07c113551"} Mar 13 20:51:06 crc kubenswrapper[4790]: I0313 20:51:06.167917 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2645f50-482e-487d-9b16-c2a066630480","Type":"ContainerStarted","Data":"ea74cefcade4fe49ce3766e5d583e56536dfa46f88ac3f98686e4aaac580a73a"} Mar 13 20:51:06 crc kubenswrapper[4790]: I0313 20:51:06.169219 4790 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"d2645f50-482e-487d-9b16-c2a066630480","Type":"ContainerStarted","Data":"c30e2102628823eab1ee3054424c04fbd3251b4b6386adcb17330147bbd91bd2"} Mar 13 20:51:09 crc kubenswrapper[4790]: I0313 20:51:09.209859 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2645f50-482e-487d-9b16-c2a066630480","Type":"ContainerStarted","Data":"d6f7a91d9497dc21a044074639ef6035928f9b7d66af07297fc5f6dc5a406499"} Mar 13 20:51:09 crc kubenswrapper[4790]: I0313 20:51:09.210414 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 20:51:09 crc kubenswrapper[4790]: I0313 20:51:09.245076 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.041921117 podStartE2EDuration="6.245055046s" podCreationTimestamp="2026-03-13 20:51:03 +0000 UTC" firstStartedPulling="2026-03-13 20:51:04.025018692 +0000 UTC m=+1395.046134583" lastFinishedPulling="2026-03-13 20:51:08.228152621 +0000 UTC m=+1399.249268512" observedRunningTime="2026-03-13 20:51:09.228423722 +0000 UTC m=+1400.249539633" watchObservedRunningTime="2026-03-13 20:51:09.245055046 +0000 UTC m=+1400.266170937" Mar 13 20:51:14 crc kubenswrapper[4790]: I0313 20:51:14.015692 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:51:14 crc kubenswrapper[4790]: I0313 20:51:14.016331 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 13 20:51:33 crc kubenswrapper[4790]: I0313 20:51:33.561784 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 13 20:51:43 crc kubenswrapper[4790]: I0313 20:51:43.007419 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 20:51:44 crc kubenswrapper[4790]: I0313 20:51:44.015827 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:51:44 crc kubenswrapper[4790]: I0313 20:51:44.016138 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:51:44 crc kubenswrapper[4790]: I0313 20:51:44.016183 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:51:44 crc kubenswrapper[4790]: I0313 20:51:44.016915 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7265c148a5840e02c0d05363d253e3b056f233c63bc78d73aa4fcf9dbde019eb"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 20:51:44 crc kubenswrapper[4790]: I0313 20:51:44.016992 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" 
containerName="machine-config-daemon" containerID="cri-o://7265c148a5840e02c0d05363d253e3b056f233c63bc78d73aa4fcf9dbde019eb" gracePeriod=600 Mar 13 20:51:44 crc kubenswrapper[4790]: I0313 20:51:44.073169 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 20:51:44 crc kubenswrapper[4790]: I0313 20:51:44.545285 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="7265c148a5840e02c0d05363d253e3b056f233c63bc78d73aa4fcf9dbde019eb" exitCode=0 Mar 13 20:51:44 crc kubenswrapper[4790]: I0313 20:51:44.545368 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"7265c148a5840e02c0d05363d253e3b056f233c63bc78d73aa4fcf9dbde019eb"} Mar 13 20:51:44 crc kubenswrapper[4790]: I0313 20:51:44.545492 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e"} Mar 13 20:51:44 crc kubenswrapper[4790]: I0313 20:51:44.545514 4790 scope.go:117] "RemoveContainer" containerID="232d637183e61cb15eeba88ed1e9cabcbc6f085073f5f974ddeeeb1a6f8eb83c" Mar 13 20:51:47 crc kubenswrapper[4790]: I0313 20:51:47.749329 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="e50b80fb-2251-49e7-a285-1276dbaa3237" containerName="rabbitmq" containerID="cri-o://b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e" gracePeriod=604796 Mar 13 20:51:48 crc kubenswrapper[4790]: I0313 20:51:48.352953 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="c575f482-56cd-4dfc-84c6-c6bb922d56a9" 
containerName="rabbitmq" containerID="cri-o://9121b19136cbb4acb4e68cfa3615a87c401027dec9c5c50a951e3a05a6de57b4" gracePeriod=604796 Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.308530 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.348886 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"e50b80fb-2251-49e7-a285-1276dbaa3237\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.348947 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-confd\") pod \"e50b80fb-2251-49e7-a285-1276dbaa3237\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.349018 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-tls\") pod \"e50b80fb-2251-49e7-a285-1276dbaa3237\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.349037 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-plugins\") pod \"e50b80fb-2251-49e7-a285-1276dbaa3237\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.349095 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-erlang-cookie\") pod 
\"e50b80fb-2251-49e7-a285-1276dbaa3237\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.349232 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e50b80fb-2251-49e7-a285-1276dbaa3237-pod-info\") pod \"e50b80fb-2251-49e7-a285-1276dbaa3237\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.349257 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-config-data\") pod \"e50b80fb-2251-49e7-a285-1276dbaa3237\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.349298 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-server-conf\") pod \"e50b80fb-2251-49e7-a285-1276dbaa3237\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.349320 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-plugins-conf\") pod \"e50b80fb-2251-49e7-a285-1276dbaa3237\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.349342 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e50b80fb-2251-49e7-a285-1276dbaa3237-erlang-cookie-secret\") pod \"e50b80fb-2251-49e7-a285-1276dbaa3237\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.349357 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-2zf2q\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-kube-api-access-2zf2q\") pod \"e50b80fb-2251-49e7-a285-1276dbaa3237\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.355707 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-kube-api-access-2zf2q" (OuterVolumeSpecName: "kube-api-access-2zf2q") pod "e50b80fb-2251-49e7-a285-1276dbaa3237" (UID: "e50b80fb-2251-49e7-a285-1276dbaa3237"). InnerVolumeSpecName "kube-api-access-2zf2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.355938 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e50b80fb-2251-49e7-a285-1276dbaa3237" (UID: "e50b80fb-2251-49e7-a285-1276dbaa3237"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.356428 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e50b80fb-2251-49e7-a285-1276dbaa3237" (UID: "e50b80fb-2251-49e7-a285-1276dbaa3237"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.356434 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e50b80fb-2251-49e7-a285-1276dbaa3237" (UID: "e50b80fb-2251-49e7-a285-1276dbaa3237"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.356975 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "e50b80fb-2251-49e7-a285-1276dbaa3237" (UID: "e50b80fb-2251-49e7-a285-1276dbaa3237"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.357231 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e50b80fb-2251-49e7-a285-1276dbaa3237" (UID: "e50b80fb-2251-49e7-a285-1276dbaa3237"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.362127 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e50b80fb-2251-49e7-a285-1276dbaa3237-pod-info" (OuterVolumeSpecName: "pod-info") pod "e50b80fb-2251-49e7-a285-1276dbaa3237" (UID: "e50b80fb-2251-49e7-a285-1276dbaa3237"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.367011 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e50b80fb-2251-49e7-a285-1276dbaa3237-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e50b80fb-2251-49e7-a285-1276dbaa3237" (UID: "e50b80fb-2251-49e7-a285-1276dbaa3237"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.383267 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-config-data" (OuterVolumeSpecName: "config-data") pod "e50b80fb-2251-49e7-a285-1276dbaa3237" (UID: "e50b80fb-2251-49e7-a285-1276dbaa3237"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.424255 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-server-conf" (OuterVolumeSpecName: "server-conf") pod "e50b80fb-2251-49e7-a285-1276dbaa3237" (UID: "e50b80fb-2251-49e7-a285-1276dbaa3237"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.452799 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.452833 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.452842 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.452853 4790 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e50b80fb-2251-49e7-a285-1276dbaa3237-pod-info\") on node \"crc\" DevicePath \"\"" 
Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.452861 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.452869 4790 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-server-conf\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.452877 4790 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.452884 4790 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e50b80fb-2251-49e7-a285-1276dbaa3237-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.452892 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zf2q\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-kube-api-access-2zf2q\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.452920 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.458326 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e50b80fb-2251-49e7-a285-1276dbaa3237" (UID: "e50b80fb-2251-49e7-a285-1276dbaa3237"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.481160 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.554275 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.554312 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.855111 4790 generic.go:334] "Generic (PLEG): container finished" podID="e50b80fb-2251-49e7-a285-1276dbaa3237" containerID="b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e" exitCode=0 Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.855203 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.855231 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e50b80fb-2251-49e7-a285-1276dbaa3237","Type":"ContainerDied","Data":"b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e"} Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.855534 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e50b80fb-2251-49e7-a285-1276dbaa3237","Type":"ContainerDied","Data":"6218f617d211db14656d09a088c6de02a6677348fa07bdf9d142d99af0111ad7"} Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.855568 4790 scope.go:117] "RemoveContainer" containerID="b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.857928 4790 generic.go:334] "Generic (PLEG): container finished" podID="c575f482-56cd-4dfc-84c6-c6bb922d56a9" containerID="9121b19136cbb4acb4e68cfa3615a87c401027dec9c5c50a951e3a05a6de57b4" exitCode=0 Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.857980 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c575f482-56cd-4dfc-84c6-c6bb922d56a9","Type":"ContainerDied","Data":"9121b19136cbb4acb4e68cfa3615a87c401027dec9c5c50a951e3a05a6de57b4"} Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.892675 4790 scope.go:117] "RemoveContainer" containerID="e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.900603 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.919315 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.940292 4790 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 20:51:54 crc kubenswrapper[4790]: E0313 20:51:54.940780 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e50b80fb-2251-49e7-a285-1276dbaa3237" containerName="rabbitmq" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.940797 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e50b80fb-2251-49e7-a285-1276dbaa3237" containerName="rabbitmq" Mar 13 20:51:54 crc kubenswrapper[4790]: E0313 20:51:54.940854 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e50b80fb-2251-49e7-a285-1276dbaa3237" containerName="setup-container" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.940863 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e50b80fb-2251-49e7-a285-1276dbaa3237" containerName="setup-container" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.941062 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e50b80fb-2251-49e7-a285-1276dbaa3237" containerName="rabbitmq" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.942314 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.950818 4790 scope.go:117] "RemoveContainer" containerID="b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.954941 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 20:51:54 crc kubenswrapper[4790]: E0313 20:51:54.956538 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e\": container with ID starting with b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e not found: ID does not exist" containerID="b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.956583 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e"} err="failed to get container status \"b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e\": rpc error: code = NotFound desc = could not find container \"b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e\": container with ID starting with b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e not found: ID does not exist" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.956612 4790 scope.go:117] "RemoveContainer" containerID="e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.957292 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.957634 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 13 20:51:54 crc 
kubenswrapper[4790]: I0313 20:51:54.957794 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.957927 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-bssvd" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.958088 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.958114 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 13 20:51:54 crc kubenswrapper[4790]: E0313 20:51:54.971776 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde\": container with ID starting with e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde not found: ID does not exist" containerID="e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.971841 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde"} err="failed to get container status \"e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde\": rpc error: code = NotFound desc = could not find container \"e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde\": container with ID starting with e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde not found: ID does not exist" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.976890 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.063783 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/72ed8a4f-a46a-4e41-9335-f10dc6338627-pod-info\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.063843 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/72ed8a4f-a46a-4e41-9335-f10dc6338627-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.063904 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/72ed8a4f-a46a-4e41-9335-f10dc6338627-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.063938 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.063997 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctjqk\" (UniqueName: \"kubernetes.io/projected/72ed8a4f-a46a-4e41-9335-f10dc6338627-kube-api-access-ctjqk\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.064052 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.064100 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72ed8a4f-a46a-4e41-9335-f10dc6338627-config-data\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.064143 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.064170 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.064197 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/72ed8a4f-a46a-4e41-9335-f10dc6338627-server-conf\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.064234 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165164 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165479 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/72ed8a4f-a46a-4e41-9335-f10dc6338627-server-conf\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165529 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165608 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/72ed8a4f-a46a-4e41-9335-f10dc6338627-pod-info\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165633 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/72ed8a4f-a46a-4e41-9335-f10dc6338627-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165678 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/72ed8a4f-a46a-4e41-9335-f10dc6338627-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165704 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165753 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctjqk\" (UniqueName: \"kubernetes.io/projected/72ed8a4f-a46a-4e41-9335-f10dc6338627-kube-api-access-ctjqk\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165800 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165835 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72ed8a4f-a46a-4e41-9335-f10dc6338627-config-data\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165862 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " 
pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165879 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.166400 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.166472 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.166592 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.167017 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/72ed8a4f-a46a-4e41-9335-f10dc6338627-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.167101 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72ed8a4f-a46a-4e41-9335-f10dc6338627-config-data\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.167612 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/72ed8a4f-a46a-4e41-9335-f10dc6338627-server-conf\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.171954 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/72ed8a4f-a46a-4e41-9335-f10dc6338627-pod-info\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.172094 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.172212 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.176886 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/72ed8a4f-a46a-4e41-9335-f10dc6338627-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.183431 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctjqk\" (UniqueName: \"kubernetes.io/projected/72ed8a4f-a46a-4e41-9335-f10dc6338627-kube-api-access-ctjqk\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.211039 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.266895 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c575f482-56cd-4dfc-84c6-c6bb922d56a9-pod-info\") pod \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.266943 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-erlang-cookie\") pod \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267027 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c575f482-56cd-4dfc-84c6-c6bb922d56a9-erlang-cookie-secret\") pod \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267079 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-plugins-conf\") pod \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267112 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-confd\") pod \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267157 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-tls\") pod \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267177 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-config-data\") pod \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267197 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267247 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skg8b\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-kube-api-access-skg8b\") pod \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " Mar 13 20:51:55 crc kubenswrapper[4790]: 
I0313 20:51:55.267300 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-plugins\") pod \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267355 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-server-conf\") pod \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267527 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c575f482-56cd-4dfc-84c6-c6bb922d56a9" (UID: "c575f482-56cd-4dfc-84c6-c6bb922d56a9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267703 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c575f482-56cd-4dfc-84c6-c6bb922d56a9" (UID: "c575f482-56cd-4dfc-84c6-c6bb922d56a9"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267830 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c575f482-56cd-4dfc-84c6-c6bb922d56a9" (UID: "c575f482-56cd-4dfc-84c6-c6bb922d56a9"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.268121 4790 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.268143 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.268154 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.270862 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "c575f482-56cd-4dfc-84c6-c6bb922d56a9" (UID: "c575f482-56cd-4dfc-84c6-c6bb922d56a9"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.273124 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c575f482-56cd-4dfc-84c6-c6bb922d56a9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c575f482-56cd-4dfc-84c6-c6bb922d56a9" (UID: "c575f482-56cd-4dfc-84c6-c6bb922d56a9"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.279822 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-kube-api-access-skg8b" (OuterVolumeSpecName: "kube-api-access-skg8b") pod "c575f482-56cd-4dfc-84c6-c6bb922d56a9" (UID: "c575f482-56cd-4dfc-84c6-c6bb922d56a9"). InnerVolumeSpecName "kube-api-access-skg8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.283932 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c575f482-56cd-4dfc-84c6-c6bb922d56a9" (UID: "c575f482-56cd-4dfc-84c6-c6bb922d56a9"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.293548 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-config-data" (OuterVolumeSpecName: "config-data") pod "c575f482-56cd-4dfc-84c6-c6bb922d56a9" (UID: "c575f482-56cd-4dfc-84c6-c6bb922d56a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.293664 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c575f482-56cd-4dfc-84c6-c6bb922d56a9-pod-info" (OuterVolumeSpecName: "pod-info") pod "c575f482-56cd-4dfc-84c6-c6bb922d56a9" (UID: "c575f482-56cd-4dfc-84c6-c6bb922d56a9"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.310324 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.331195 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-server-conf" (OuterVolumeSpecName: "server-conf") pod "c575f482-56cd-4dfc-84c6-c6bb922d56a9" (UID: "c575f482-56cd-4dfc-84c6-c6bb922d56a9"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.370180 4790 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-server-conf\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.370222 4790 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c575f482-56cd-4dfc-84c6-c6bb922d56a9-pod-info\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.370236 4790 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c575f482-56cd-4dfc-84c6-c6bb922d56a9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.370250 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.370261 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.370288 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.370301 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skg8b\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-kube-api-access-skg8b\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.397711 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.420339 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c575f482-56cd-4dfc-84c6-c6bb922d56a9" (UID: "c575f482-56cd-4dfc-84c6-c6bb922d56a9"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.473294 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.473328 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.671447 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e50b80fb-2251-49e7-a285-1276dbaa3237" path="/var/lib/kubelet/pods/e50b80fb-2251-49e7-a285-1276dbaa3237/volumes" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.769288 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 
20:51:55.869568 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c575f482-56cd-4dfc-84c6-c6bb922d56a9","Type":"ContainerDied","Data":"dd3eb9a0e5bdb0287eed7cfa261bf8f63d9daa5df053b0925d31bc794e7ad761"} Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.869599 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.869640 4790 scope.go:117] "RemoveContainer" containerID="9121b19136cbb4acb4e68cfa3615a87c401027dec9c5c50a951e3a05a6de57b4" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.872324 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"72ed8a4f-a46a-4e41-9335-f10dc6338627","Type":"ContainerStarted","Data":"20ca964e1e08449d26c45c0061f210e92a451b15f9229ba73e5bfff41e0c13ed"} Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.895741 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.898723 4790 scope.go:117] "RemoveContainer" containerID="a8891038882e88af0702659321fde381a785634e4a17975de8d9af4797337040" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.909009 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.922129 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 20:51:55 crc kubenswrapper[4790]: E0313 20:51:55.922941 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c575f482-56cd-4dfc-84c6-c6bb922d56a9" containerName="rabbitmq" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.922965 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c575f482-56cd-4dfc-84c6-c6bb922d56a9" containerName="rabbitmq" Mar 13 20:51:55 crc 
kubenswrapper[4790]: E0313 20:51:55.923002 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c575f482-56cd-4dfc-84c6-c6bb922d56a9" containerName="setup-container" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.923009 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c575f482-56cd-4dfc-84c6-c6bb922d56a9" containerName="setup-container" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.923182 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c575f482-56cd-4dfc-84c6-c6bb922d56a9" containerName="rabbitmq" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.924131 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.931439 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.931677 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.931849 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.931928 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6fg95" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.932260 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.932650 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.932826 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 13 20:51:55 crc 
kubenswrapper[4790]: I0313 20:51:55.951258 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.105872 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.106203 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.106236 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.106271 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.106308 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.106626 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.106707 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.106866 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.106995 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.107055 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p45h4\" (UniqueName: \"kubernetes.io/projected/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-kube-api-access-p45h4\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.107115 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.208864 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.208950 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.209002 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.209068 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.209097 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.209143 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.209181 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.209206 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p45h4\" (UniqueName: \"kubernetes.io/projected/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-kube-api-access-p45h4\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.209247 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc 
kubenswrapper[4790]: I0313 20:51:56.209274 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.209294 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.209671 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.211478 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.212158 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.212496 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.212823 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.213331 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.219359 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.219504 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.220062 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.221882 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.230789 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p45h4\" (UniqueName: \"kubernetes.io/projected/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-kube-api-access-p45h4\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.241165 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.250840 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.337618 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-gwl49"] Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.340164 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.342845 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.354312 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-gwl49"] Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.517653 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j4h8\" (UniqueName: \"kubernetes.io/projected/f1f7ae1d-5633-4005-8813-533cadffdf5f-kube-api-access-9j4h8\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.517986 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.518008 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.518061 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-config\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.518088 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.518134 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.518151 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.619300 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.619349 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: 
\"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.619413 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j4h8\" (UniqueName: \"kubernetes.io/projected/f1f7ae1d-5633-4005-8813-533cadffdf5f-kube-api-access-9j4h8\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.619508 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.619536 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.619609 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-config\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.619645 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.620788 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-config\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.620840 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.620863 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.620845 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.621508 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 
20:51:56.621751 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.645193 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j4h8\" (UniqueName: \"kubernetes.io/projected/f1f7ae1d-5633-4005-8813-533cadffdf5f-kube-api-access-9j4h8\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.718513 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.791887 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 20:51:56 crc kubenswrapper[4790]: W0313 20:51:56.791990 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ac7c2bb_fa6a_437a_9af3_d4ffa930bdf9.slice/crio-5ca9b139f82dc48918b214c007c4d393720a6ef7e8e7572afe6f0afc1d963463 WatchSource:0}: Error finding container 5ca9b139f82dc48918b214c007c4d393720a6ef7e8e7572afe6f0afc1d963463: Status 404 returned error can't find the container with id 5ca9b139f82dc48918b214c007c4d393720a6ef7e8e7572afe6f0afc1d963463 Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.892390 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9","Type":"ContainerStarted","Data":"5ca9b139f82dc48918b214c007c4d393720a6ef7e8e7572afe6f0afc1d963463"} Mar 13 20:51:57 crc kubenswrapper[4790]: W0313 20:51:57.192208 4790 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1f7ae1d_5633_4005_8813_533cadffdf5f.slice/crio-32be04d7937ad9af79c18417e3fc9bc5c26ab317292137d21354d38bf769ecdc WatchSource:0}: Error finding container 32be04d7937ad9af79c18417e3fc9bc5c26ab317292137d21354d38bf769ecdc: Status 404 returned error can't find the container with id 32be04d7937ad9af79c18417e3fc9bc5c26ab317292137d21354d38bf769ecdc Mar 13 20:51:57 crc kubenswrapper[4790]: I0313 20:51:57.193605 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-gwl49"] Mar 13 20:51:57 crc kubenswrapper[4790]: I0313 20:51:57.671424 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c575f482-56cd-4dfc-84c6-c6bb922d56a9" path="/var/lib/kubelet/pods/c575f482-56cd-4dfc-84c6-c6bb922d56a9/volumes" Mar 13 20:51:57 crc kubenswrapper[4790]: I0313 20:51:57.905081 4790 generic.go:334] "Generic (PLEG): container finished" podID="f1f7ae1d-5633-4005-8813-533cadffdf5f" containerID="9d326b0055fe1b60ac8b2237b02db615dcfa6ce3d1b10696331136dd47dd8e91" exitCode=0 Mar 13 20:51:57 crc kubenswrapper[4790]: I0313 20:51:57.905175 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" event={"ID":"f1f7ae1d-5633-4005-8813-533cadffdf5f","Type":"ContainerDied","Data":"9d326b0055fe1b60ac8b2237b02db615dcfa6ce3d1b10696331136dd47dd8e91"} Mar 13 20:51:57 crc kubenswrapper[4790]: I0313 20:51:57.905233 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" event={"ID":"f1f7ae1d-5633-4005-8813-533cadffdf5f","Type":"ContainerStarted","Data":"32be04d7937ad9af79c18417e3fc9bc5c26ab317292137d21354d38bf769ecdc"} Mar 13 20:51:57 crc kubenswrapper[4790]: I0313 20:51:57.907161 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"72ed8a4f-a46a-4e41-9335-f10dc6338627","Type":"ContainerStarted","Data":"26817f017f043cd97911724e5d41f909397e98b30bfd97efcb9244f2cb38d580"} Mar 13 20:51:58 crc kubenswrapper[4790]: I0313 20:51:58.918534 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9","Type":"ContainerStarted","Data":"6cc57bdf5a38660fb3604a1deab2679244c408dc5a05666a57500776843ad98d"} Mar 13 20:51:58 crc kubenswrapper[4790]: I0313 20:51:58.921350 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" event={"ID":"f1f7ae1d-5633-4005-8813-533cadffdf5f","Type":"ContainerStarted","Data":"038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc"} Mar 13 20:51:58 crc kubenswrapper[4790]: I0313 20:51:58.987543 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" podStartSLOduration=2.987494076 podStartE2EDuration="2.987494076s" podCreationTimestamp="2026-03-13 20:51:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:51:58.973826334 +0000 UTC m=+1449.994942235" watchObservedRunningTime="2026-03-13 20:51:58.987494076 +0000 UTC m=+1450.008609967" Mar 13 20:51:59 crc kubenswrapper[4790]: I0313 20:51:59.947575 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.140507 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557252-mfnmk"] Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.143182 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557252-mfnmk" Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.145764 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.147687 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.147863 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.151243 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557252-mfnmk"] Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.199919 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxqqt\" (UniqueName: \"kubernetes.io/projected/b77751d8-7e07-4d67-9bed-3858cbfc5c3f-kube-api-access-jxqqt\") pod \"auto-csr-approver-29557252-mfnmk\" (UID: \"b77751d8-7e07-4d67-9bed-3858cbfc5c3f\") " pod="openshift-infra/auto-csr-approver-29557252-mfnmk" Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.302189 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxqqt\" (UniqueName: \"kubernetes.io/projected/b77751d8-7e07-4d67-9bed-3858cbfc5c3f-kube-api-access-jxqqt\") pod \"auto-csr-approver-29557252-mfnmk\" (UID: \"b77751d8-7e07-4d67-9bed-3858cbfc5c3f\") " pod="openshift-infra/auto-csr-approver-29557252-mfnmk" Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.323337 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxqqt\" (UniqueName: \"kubernetes.io/projected/b77751d8-7e07-4d67-9bed-3858cbfc5c3f-kube-api-access-jxqqt\") pod \"auto-csr-approver-29557252-mfnmk\" (UID: \"b77751d8-7e07-4d67-9bed-3858cbfc5c3f\") " 
pod="openshift-infra/auto-csr-approver-29557252-mfnmk" Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.471457 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557252-mfnmk" Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.899608 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557252-mfnmk"] Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.955664 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557252-mfnmk" event={"ID":"b77751d8-7e07-4d67-9bed-3858cbfc5c3f","Type":"ContainerStarted","Data":"cbc02170592008372a96345d06138ff81a99dceb83db04c5eb0d1033f77737c4"} Mar 13 20:52:02 crc kubenswrapper[4790]: I0313 20:52:02.976669 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557252-mfnmk" event={"ID":"b77751d8-7e07-4d67-9bed-3858cbfc5c3f","Type":"ContainerDied","Data":"b5ea61f802c1b094e15351a6cc95042eca8f16ab2272c8f7af336afbb299a8d5"} Mar 13 20:52:02 crc kubenswrapper[4790]: I0313 20:52:02.976514 4790 generic.go:334] "Generic (PLEG): container finished" podID="b77751d8-7e07-4d67-9bed-3858cbfc5c3f" containerID="b5ea61f802c1b094e15351a6cc95042eca8f16ab2272c8f7af336afbb299a8d5" exitCode=0 Mar 13 20:52:04 crc kubenswrapper[4790]: I0313 20:52:04.322130 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557252-mfnmk" Mar 13 20:52:04 crc kubenswrapper[4790]: I0313 20:52:04.379334 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxqqt\" (UniqueName: \"kubernetes.io/projected/b77751d8-7e07-4d67-9bed-3858cbfc5c3f-kube-api-access-jxqqt\") pod \"b77751d8-7e07-4d67-9bed-3858cbfc5c3f\" (UID: \"b77751d8-7e07-4d67-9bed-3858cbfc5c3f\") " Mar 13 20:52:04 crc kubenswrapper[4790]: I0313 20:52:04.386223 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b77751d8-7e07-4d67-9bed-3858cbfc5c3f-kube-api-access-jxqqt" (OuterVolumeSpecName: "kube-api-access-jxqqt") pod "b77751d8-7e07-4d67-9bed-3858cbfc5c3f" (UID: "b77751d8-7e07-4d67-9bed-3858cbfc5c3f"). InnerVolumeSpecName "kube-api-access-jxqqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:52:04 crc kubenswrapper[4790]: I0313 20:52:04.481612 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxqqt\" (UniqueName: \"kubernetes.io/projected/b77751d8-7e07-4d67-9bed-3858cbfc5c3f-kube-api-access-jxqqt\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:04 crc kubenswrapper[4790]: I0313 20:52:04.997215 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557252-mfnmk" event={"ID":"b77751d8-7e07-4d67-9bed-3858cbfc5c3f","Type":"ContainerDied","Data":"cbc02170592008372a96345d06138ff81a99dceb83db04c5eb0d1033f77737c4"} Mar 13 20:52:04 crc kubenswrapper[4790]: I0313 20:52:04.997583 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbc02170592008372a96345d06138ff81a99dceb83db04c5eb0d1033f77737c4" Mar 13 20:52:04 crc kubenswrapper[4790]: I0313 20:52:04.997252 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557252-mfnmk" Mar 13 20:52:05 crc kubenswrapper[4790]: I0313 20:52:05.392488 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557246-lrvrv"] Mar 13 20:52:05 crc kubenswrapper[4790]: I0313 20:52:05.400145 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557246-lrvrv"] Mar 13 20:52:05 crc kubenswrapper[4790]: I0313 20:52:05.670490 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97e8561a-a685-44f0-986c-1559e5818ba8" path="/var/lib/kubelet/pods/97e8561a-a685-44f0-986c-1559e5818ba8/volumes" Mar 13 20:52:06 crc kubenswrapper[4790]: I0313 20:52:06.720421 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:52:06 crc kubenswrapper[4790]: I0313 20:52:06.777020 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-kqzmj"] Mar 13 20:52:06 crc kubenswrapper[4790]: I0313 20:52:06.777600 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" podUID="03ea3d76-1bca-44e8-986c-8e751336f93d" containerName="dnsmasq-dns" containerID="cri-o://4bd79e27621e3d3d4bce68941d1a486f8bc96266be819067b6ade98b7e023e29" gracePeriod=10 Mar 13 20:52:06 crc kubenswrapper[4790]: I0313 20:52:06.969437 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55478c4467-p5ml2"] Mar 13 20:52:06 crc kubenswrapper[4790]: E0313 20:52:06.969798 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b77751d8-7e07-4d67-9bed-3858cbfc5c3f" containerName="oc" Mar 13 20:52:06 crc kubenswrapper[4790]: I0313 20:52:06.969814 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b77751d8-7e07-4d67-9bed-3858cbfc5c3f" containerName="oc" Mar 13 20:52:06 crc kubenswrapper[4790]: I0313 20:52:06.970013 4790 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b77751d8-7e07-4d67-9bed-3858cbfc5c3f" containerName="oc" Mar 13 20:52:06 crc kubenswrapper[4790]: I0313 20:52:06.971145 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.000966 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-p5ml2"] Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.038440 4790 generic.go:334] "Generic (PLEG): container finished" podID="03ea3d76-1bca-44e8-986c-8e751336f93d" containerID="4bd79e27621e3d3d4bce68941d1a486f8bc96266be819067b6ade98b7e023e29" exitCode=0 Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.038485 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" event={"ID":"03ea3d76-1bca-44e8-986c-8e751336f93d","Type":"ContainerDied","Data":"4bd79e27621e3d3d4bce68941d1a486f8bc96266be819067b6ade98b7e023e29"} Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.041758 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-dns-svc\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.041884 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.041925 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.042140 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.042289 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.042421 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq9d8\" (UniqueName: \"kubernetes.io/projected/66175627-2b03-49c6-a7a1-de69f8851d9a-kube-api-access-hq9d8\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.042529 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-config\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.144813 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.144879 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq9d8\" (UniqueName: \"kubernetes.io/projected/66175627-2b03-49c6-a7a1-de69f8851d9a-kube-api-access-hq9d8\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.144918 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-config\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.144997 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-dns-svc\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.145093 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.145142 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.145168 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.146818 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.146999 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.147110 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.147827 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.148307 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-dns-svc\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.148853 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-config\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.167300 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq9d8\" (UniqueName: \"kubernetes.io/projected/66175627-2b03-49c6-a7a1-de69f8851d9a-kube-api-access-hq9d8\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.289033 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.389633 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.454283 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-nb\") pod \"03ea3d76-1bca-44e8-986c-8e751336f93d\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.454438 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-sb\") pod \"03ea3d76-1bca-44e8-986c-8e751336f93d\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.454468 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-svc\") pod \"03ea3d76-1bca-44e8-986c-8e751336f93d\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.454570 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-swift-storage-0\") pod \"03ea3d76-1bca-44e8-986c-8e751336f93d\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.454632 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-config\") pod \"03ea3d76-1bca-44e8-986c-8e751336f93d\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.454677 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz7vw\" 
(UniqueName: \"kubernetes.io/projected/03ea3d76-1bca-44e8-986c-8e751336f93d-kube-api-access-rz7vw\") pod \"03ea3d76-1bca-44e8-986c-8e751336f93d\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.458997 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03ea3d76-1bca-44e8-986c-8e751336f93d-kube-api-access-rz7vw" (OuterVolumeSpecName: "kube-api-access-rz7vw") pod "03ea3d76-1bca-44e8-986c-8e751336f93d" (UID: "03ea3d76-1bca-44e8-986c-8e751336f93d"). InnerVolumeSpecName "kube-api-access-rz7vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.521146 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "03ea3d76-1bca-44e8-986c-8e751336f93d" (UID: "03ea3d76-1bca-44e8-986c-8e751336f93d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.525503 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-config" (OuterVolumeSpecName: "config") pod "03ea3d76-1bca-44e8-986c-8e751336f93d" (UID: "03ea3d76-1bca-44e8-986c-8e751336f93d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.532598 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "03ea3d76-1bca-44e8-986c-8e751336f93d" (UID: "03ea3d76-1bca-44e8-986c-8e751336f93d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:52:07 crc kubenswrapper[4790]: E0313 20:52:07.540677 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-svc podName:03ea3d76-1bca-44e8-986c-8e751336f93d nodeName:}" failed. No retries permitted until 2026-03-13 20:52:08.040652589 +0000 UTC m=+1459.061768480 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-svc") pod "03ea3d76-1bca-44e8-986c-8e751336f93d" (UID: "03ea3d76-1bca-44e8-986c-8e751336f93d") : error deleting /var/lib/kubelet/pods/03ea3d76-1bca-44e8-986c-8e751336f93d/volume-subpaths: remove /var/lib/kubelet/pods/03ea3d76-1bca-44e8-986c-8e751336f93d/volume-subpaths: no such file or directory Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.540954 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "03ea3d76-1bca-44e8-986c-8e751336f93d" (UID: "03ea3d76-1bca-44e8-986c-8e751336f93d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.557049 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.557097 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.557113 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.557126 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.557137 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rz7vw\" (UniqueName: \"kubernetes.io/projected/03ea3d76-1bca-44e8-986c-8e751336f93d-kube-api-access-rz7vw\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.736064 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-p5ml2"] Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.048276 4790 generic.go:334] "Generic (PLEG): container finished" podID="66175627-2b03-49c6-a7a1-de69f8851d9a" containerID="27871394f5ffcdc35185d80b9e0ce4c575067fe2fa1efdd455ca8a9c92d8ff49" exitCode=0 Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.048338 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-p5ml2" 
event={"ID":"66175627-2b03-49c6-a7a1-de69f8851d9a","Type":"ContainerDied","Data":"27871394f5ffcdc35185d80b9e0ce4c575067fe2fa1efdd455ca8a9c92d8ff49"} Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.048364 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-p5ml2" event={"ID":"66175627-2b03-49c6-a7a1-de69f8851d9a","Type":"ContainerStarted","Data":"6927ac38b8d0f8f8ffc931229c562675d8587b274b5beeb240669881cdf429f0"} Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.050333 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" event={"ID":"03ea3d76-1bca-44e8-986c-8e751336f93d","Type":"ContainerDied","Data":"b70c45deb3c4d259bee0d048396a3a151ac96b6eb37804970ed797de5f96a100"} Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.050408 4790 scope.go:117] "RemoveContainer" containerID="4bd79e27621e3d3d4bce68941d1a486f8bc96266be819067b6ade98b7e023e29" Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.050403 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.065716 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-svc\") pod \"03ea3d76-1bca-44e8-986c-8e751336f93d\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.068314 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "03ea3d76-1bca-44e8-986c-8e751336f93d" (UID: "03ea3d76-1bca-44e8-986c-8e751336f93d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.167777 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.225118 4790 scope.go:117] "RemoveContainer" containerID="3bf4c1a3a8959712b6bdc6bb2a33893090891a1211e7646c25c1b2fcadfa4181" Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.384180 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-kqzmj"] Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.394490 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-kqzmj"] Mar 13 20:52:09 crc kubenswrapper[4790]: I0313 20:52:09.061688 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-p5ml2" event={"ID":"66175627-2b03-49c6-a7a1-de69f8851d9a","Type":"ContainerStarted","Data":"82ab853eb5a738c8ea6d020bf49ff5de2f7ce41d17db3f977cc89e0d8624b3de"} Mar 13 20:52:09 crc kubenswrapper[4790]: I0313 20:52:09.062710 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:09 crc kubenswrapper[4790]: I0313 20:52:09.086388 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55478c4467-p5ml2" podStartSLOduration=3.086352553 podStartE2EDuration="3.086352553s" podCreationTimestamp="2026-03-13 20:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:52:09.084907404 +0000 UTC m=+1460.106023305" watchObservedRunningTime="2026-03-13 20:52:09.086352553 +0000 UTC m=+1460.107468444" Mar 13 20:52:09 crc kubenswrapper[4790]: I0313 20:52:09.697423 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="03ea3d76-1bca-44e8-986c-8e751336f93d" path="/var/lib/kubelet/pods/03ea3d76-1bca-44e8-986c-8e751336f93d/volumes" Mar 13 20:52:17 crc kubenswrapper[4790]: I0313 20:52:17.290977 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:17 crc kubenswrapper[4790]: I0313 20:52:17.355745 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-gwl49"] Mar 13 20:52:17 crc kubenswrapper[4790]: I0313 20:52:17.356103 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" podUID="f1f7ae1d-5633-4005-8813-533cadffdf5f" containerName="dnsmasq-dns" containerID="cri-o://038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc" gracePeriod=10 Mar 13 20:52:17 crc kubenswrapper[4790]: I0313 20:52:17.829966 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:52:17 crc kubenswrapper[4790]: I0313 20:52:17.997625 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-nb\") pod \"f1f7ae1d-5633-4005-8813-533cadffdf5f\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " Mar 13 20:52:17 crc kubenswrapper[4790]: I0313 20:52:17.997681 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-sb\") pod \"f1f7ae1d-5633-4005-8813-533cadffdf5f\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " Mar 13 20:52:17 crc kubenswrapper[4790]: I0313 20:52:17.997779 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-svc\") pod 
\"f1f7ae1d-5633-4005-8813-533cadffdf5f\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " Mar 13 20:52:17 crc kubenswrapper[4790]: I0313 20:52:17.997868 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-swift-storage-0\") pod \"f1f7ae1d-5633-4005-8813-533cadffdf5f\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " Mar 13 20:52:17 crc kubenswrapper[4790]: I0313 20:52:17.997940 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-config\") pod \"f1f7ae1d-5633-4005-8813-533cadffdf5f\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " Mar 13 20:52:17 crc kubenswrapper[4790]: I0313 20:52:17.997976 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j4h8\" (UniqueName: \"kubernetes.io/projected/f1f7ae1d-5633-4005-8813-533cadffdf5f-kube-api-access-9j4h8\") pod \"f1f7ae1d-5633-4005-8813-533cadffdf5f\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " Mar 13 20:52:17 crc kubenswrapper[4790]: I0313 20:52:17.998103 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-openstack-edpm-ipam\") pod \"f1f7ae1d-5633-4005-8813-533cadffdf5f\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.008469 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1f7ae1d-5633-4005-8813-533cadffdf5f-kube-api-access-9j4h8" (OuterVolumeSpecName: "kube-api-access-9j4h8") pod "f1f7ae1d-5633-4005-8813-533cadffdf5f" (UID: "f1f7ae1d-5633-4005-8813-533cadffdf5f"). InnerVolumeSpecName "kube-api-access-9j4h8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.054202 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f1f7ae1d-5633-4005-8813-533cadffdf5f" (UID: "f1f7ae1d-5633-4005-8813-533cadffdf5f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.060464 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-config" (OuterVolumeSpecName: "config") pod "f1f7ae1d-5633-4005-8813-533cadffdf5f" (UID: "f1f7ae1d-5633-4005-8813-533cadffdf5f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.063406 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f1f7ae1d-5633-4005-8813-533cadffdf5f" (UID: "f1f7ae1d-5633-4005-8813-533cadffdf5f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.066025 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f1f7ae1d-5633-4005-8813-533cadffdf5f" (UID: "f1f7ae1d-5633-4005-8813-533cadffdf5f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.066501 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "f1f7ae1d-5633-4005-8813-533cadffdf5f" (UID: "f1f7ae1d-5633-4005-8813-533cadffdf5f"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.073947 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f1f7ae1d-5633-4005-8813-533cadffdf5f" (UID: "f1f7ae1d-5633-4005-8813-533cadffdf5f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.100247 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.100291 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.100306 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.100318 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j4h8\" (UniqueName: \"kubernetes.io/projected/f1f7ae1d-5633-4005-8813-533cadffdf5f-kube-api-access-9j4h8\") on node \"crc\" DevicePath \"\"" 
Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.100329 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.100341 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.100351 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.147527 4790 generic.go:334] "Generic (PLEG): container finished" podID="f1f7ae1d-5633-4005-8813-533cadffdf5f" containerID="038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc" exitCode=0 Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.147574 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" event={"ID":"f1f7ae1d-5633-4005-8813-533cadffdf5f","Type":"ContainerDied","Data":"038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc"} Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.147614 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" event={"ID":"f1f7ae1d-5633-4005-8813-533cadffdf5f","Type":"ContainerDied","Data":"32be04d7937ad9af79c18417e3fc9bc5c26ab317292137d21354d38bf769ecdc"} Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.147636 4790 scope.go:117] "RemoveContainer" containerID="038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.147964 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49"
Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.185503 4790 scope.go:117] "RemoveContainer" containerID="9d326b0055fe1b60ac8b2237b02db615dcfa6ce3d1b10696331136dd47dd8e91"
Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.188684 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-gwl49"]
Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.198351 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-gwl49"]
Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.222937 4790 scope.go:117] "RemoveContainer" containerID="038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc"
Mar 13 20:52:18 crc kubenswrapper[4790]: E0313 20:52:18.223940 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc\": container with ID starting with 038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc not found: ID does not exist" containerID="038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc"
Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.224022 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc"} err="failed to get container status \"038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc\": rpc error: code = NotFound desc = could not find container \"038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc\": container with ID starting with 038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc not found: ID does not exist"
Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.224054 4790 scope.go:117] "RemoveContainer" containerID="9d326b0055fe1b60ac8b2237b02db615dcfa6ce3d1b10696331136dd47dd8e91"
Mar 13 20:52:18 crc kubenswrapper[4790]: E0313 20:52:18.224367 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d326b0055fe1b60ac8b2237b02db615dcfa6ce3d1b10696331136dd47dd8e91\": container with ID starting with 9d326b0055fe1b60ac8b2237b02db615dcfa6ce3d1b10696331136dd47dd8e91 not found: ID does not exist" containerID="9d326b0055fe1b60ac8b2237b02db615dcfa6ce3d1b10696331136dd47dd8e91"
Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.224435 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d326b0055fe1b60ac8b2237b02db615dcfa6ce3d1b10696331136dd47dd8e91"} err="failed to get container status \"9d326b0055fe1b60ac8b2237b02db615dcfa6ce3d1b10696331136dd47dd8e91\": rpc error: code = NotFound desc = could not find container \"9d326b0055fe1b60ac8b2237b02db615dcfa6ce3d1b10696331136dd47dd8e91\": container with ID starting with 9d326b0055fe1b60ac8b2237b02db615dcfa6ce3d1b10696331136dd47dd8e91 not found: ID does not exist"
Mar 13 20:52:19 crc kubenswrapper[4790]: I0313 20:52:19.673803 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1f7ae1d-5633-4005-8813-533cadffdf5f" path="/var/lib/kubelet/pods/f1f7ae1d-5633-4005-8813-533cadffdf5f/volumes"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.284037 4790 generic.go:334] "Generic (PLEG): container finished" podID="72ed8a4f-a46a-4e41-9335-f10dc6338627" containerID="26817f017f043cd97911724e5d41f909397e98b30bfd97efcb9244f2cb38d580" exitCode=0
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.284126 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"72ed8a4f-a46a-4e41-9335-f10dc6338627","Type":"ContainerDied","Data":"26817f017f043cd97911724e5d41f909397e98b30bfd97efcb9244f2cb38d580"}
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.378450 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h"]
Mar 13 20:52:30 crc kubenswrapper[4790]: E0313 20:52:30.378895 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03ea3d76-1bca-44e8-986c-8e751336f93d" containerName="init"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.378912 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="03ea3d76-1bca-44e8-986c-8e751336f93d" containerName="init"
Mar 13 20:52:30 crc kubenswrapper[4790]: E0313 20:52:30.378938 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f7ae1d-5633-4005-8813-533cadffdf5f" containerName="init"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.378945 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f7ae1d-5633-4005-8813-533cadffdf5f" containerName="init"
Mar 13 20:52:30 crc kubenswrapper[4790]: E0313 20:52:30.378966 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03ea3d76-1bca-44e8-986c-8e751336f93d" containerName="dnsmasq-dns"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.378974 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="03ea3d76-1bca-44e8-986c-8e751336f93d" containerName="dnsmasq-dns"
Mar 13 20:52:30 crc kubenswrapper[4790]: E0313 20:52:30.378986 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f7ae1d-5633-4005-8813-533cadffdf5f" containerName="dnsmasq-dns"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.378991 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f7ae1d-5633-4005-8813-533cadffdf5f" containerName="dnsmasq-dns"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.379158 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="03ea3d76-1bca-44e8-986c-8e751336f93d" containerName="dnsmasq-dns"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.379175 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f7ae1d-5633-4005-8813-533cadffdf5f" containerName="dnsmasq-dns"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.379860 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.385327 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.385492 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.385332 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.387840 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.404914 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h"]
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.541807 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.542157 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56jjn\" (UniqueName: \"kubernetes.io/projected/37459d15-1599-492b-8710-7723829a096d-kube-api-access-56jjn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.542442 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.542752 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.645114 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.645312 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.645410 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.645455 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56jjn\" (UniqueName: \"kubernetes.io/projected/37459d15-1599-492b-8710-7723829a096d-kube-api-access-56jjn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.649762 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.650033 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.651268 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.664100 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56jjn\" (UniqueName: \"kubernetes.io/projected/37459d15-1599-492b-8710-7723829a096d-kube-api-access-56jjn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h"
Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.779263 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h"
Mar 13 20:52:31 crc kubenswrapper[4790]: I0313 20:52:31.297561 4790 generic.go:334] "Generic (PLEG): container finished" podID="4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9" containerID="6cc57bdf5a38660fb3604a1deab2679244c408dc5a05666a57500776843ad98d" exitCode=0
Mar 13 20:52:31 crc kubenswrapper[4790]: I0313 20:52:31.297633 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9","Type":"ContainerDied","Data":"6cc57bdf5a38660fb3604a1deab2679244c408dc5a05666a57500776843ad98d"}
Mar 13 20:52:31 crc kubenswrapper[4790]: I0313 20:52:31.301748 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"72ed8a4f-a46a-4e41-9335-f10dc6338627","Type":"ContainerStarted","Data":"944ee40aa4f68b4e17b6cc6b58efac933bc0b5da588542fe5893c11c3a99ebe6"}
Mar 13 20:52:31 crc kubenswrapper[4790]: I0313 20:52:31.302274 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 13 20:52:31 crc kubenswrapper[4790]: I0313 20:52:31.333260 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h"]
Mar 13 20:52:31 crc kubenswrapper[4790]: I0313 20:52:31.359790 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.359766625 podStartE2EDuration="37.359766625s" podCreationTimestamp="2026-03-13 20:51:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:52:31.346473693 +0000 UTC m=+1482.367589594" watchObservedRunningTime="2026-03-13 20:52:31.359766625 +0000 UTC m=+1482.380882516"
Mar 13 20:52:32 crc kubenswrapper[4790]: I0313 20:52:32.311834 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" event={"ID":"37459d15-1599-492b-8710-7723829a096d","Type":"ContainerStarted","Data":"011eaa7798518916fd1cd36b7162e72c8aa57ffd69ac6ec084b6add865ff6d11"}
Mar 13 20:52:32 crc kubenswrapper[4790]: I0313 20:52:32.315496 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9","Type":"ContainerStarted","Data":"6af6fba67a214b440e989118f8ced7b17ad8d38fcc1fe03265f7ac7a6dce9d17"}
Mar 13 20:52:32 crc kubenswrapper[4790]: I0313 20:52:32.315758 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.448163 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=44.448138399 podStartE2EDuration="44.448138399s" podCreationTimestamp="2026-03-13 20:51:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:52:32.342869749 +0000 UTC m=+1483.363985650" watchObservedRunningTime="2026-03-13 20:52:39.448138399 +0000 UTC m=+1490.469254290"
Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.456078 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xck9d"]
Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.458101 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xck9d"
Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.471181 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xck9d"]
Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.526931 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5xmz\" (UniqueName: \"kubernetes.io/projected/8e61b37d-27a6-44dc-83c2-1aa0b9850465-kube-api-access-j5xmz\") pod \"redhat-operators-xck9d\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " pod="openshift-marketplace/redhat-operators-xck9d"
Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.527102 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-utilities\") pod \"redhat-operators-xck9d\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " pod="openshift-marketplace/redhat-operators-xck9d"
Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.527135 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-catalog-content\") pod \"redhat-operators-xck9d\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " pod="openshift-marketplace/redhat-operators-xck9d"
Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.629264 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5xmz\" (UniqueName: \"kubernetes.io/projected/8e61b37d-27a6-44dc-83c2-1aa0b9850465-kube-api-access-j5xmz\") pod \"redhat-operators-xck9d\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " pod="openshift-marketplace/redhat-operators-xck9d"
Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.629767 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-utilities\") pod \"redhat-operators-xck9d\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " pod="openshift-marketplace/redhat-operators-xck9d"
Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.629807 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-catalog-content\") pod \"redhat-operators-xck9d\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " pod="openshift-marketplace/redhat-operators-xck9d"
Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.630528 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-catalog-content\") pod \"redhat-operators-xck9d\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " pod="openshift-marketplace/redhat-operators-xck9d"
Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.631231 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-utilities\") pod \"redhat-operators-xck9d\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " pod="openshift-marketplace/redhat-operators-xck9d"
Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.658524 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5xmz\" (UniqueName: \"kubernetes.io/projected/8e61b37d-27a6-44dc-83c2-1aa0b9850465-kube-api-access-j5xmz\") pod \"redhat-operators-xck9d\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " pod="openshift-marketplace/redhat-operators-xck9d"
Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.789182 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xck9d"
Mar 13 20:52:41 crc kubenswrapper[4790]: I0313 20:52:41.732337 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xck9d"]
Mar 13 20:52:42 crc kubenswrapper[4790]: I0313 20:52:42.412674 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" event={"ID":"37459d15-1599-492b-8710-7723829a096d","Type":"ContainerStarted","Data":"68cbadfb5fc5eaf6da98e8021273feb1407956bc4335ba31888729ab3136c705"}
Mar 13 20:52:42 crc kubenswrapper[4790]: I0313 20:52:42.416951 4790 generic.go:334] "Generic (PLEG): container finished" podID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerID="9b31f907153625e03b453ccf0cbfa00742e1266b409c07a40f62754b64b0c28e" exitCode=0
Mar 13 20:52:42 crc kubenswrapper[4790]: I0313 20:52:42.416994 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xck9d" event={"ID":"8e61b37d-27a6-44dc-83c2-1aa0b9850465","Type":"ContainerDied","Data":"9b31f907153625e03b453ccf0cbfa00742e1266b409c07a40f62754b64b0c28e"}
Mar 13 20:52:42 crc kubenswrapper[4790]: I0313 20:52:42.417021 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xck9d" event={"ID":"8e61b37d-27a6-44dc-83c2-1aa0b9850465","Type":"ContainerStarted","Data":"a7c04727a428df15d4ea2c9336e79e6ee57e874b67d1a38f59d2d7e8b47dd15c"}
Mar 13 20:52:42 crc kubenswrapper[4790]: I0313 20:52:42.431007 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" podStartSLOduration=2.387143774 podStartE2EDuration="12.430983873s" podCreationTimestamp="2026-03-13 20:52:30 +0000 UTC" firstStartedPulling="2026-03-13 20:52:31.32207379 +0000 UTC m=+1482.343189691" lastFinishedPulling="2026-03-13 20:52:41.365913899 +0000 UTC m=+1492.387029790" observedRunningTime="2026-03-13 20:52:42.429630617 +0000 UTC m=+1493.450746508" watchObservedRunningTime="2026-03-13 20:52:42.430983873 +0000 UTC m=+1493.452099764"
Mar 13 20:52:44 crc kubenswrapper[4790]: I0313 20:52:44.439541 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xck9d" event={"ID":"8e61b37d-27a6-44dc-83c2-1aa0b9850465","Type":"ContainerStarted","Data":"555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf"}
Mar 13 20:52:45 crc kubenswrapper[4790]: I0313 20:52:45.313623 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Mar 13 20:52:45 crc kubenswrapper[4790]: I0313 20:52:45.454577 4790 generic.go:334] "Generic (PLEG): container finished" podID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerID="555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf" exitCode=0
Mar 13 20:52:45 crc kubenswrapper[4790]: I0313 20:52:45.454668 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xck9d" event={"ID":"8e61b37d-27a6-44dc-83c2-1aa0b9850465","Type":"ContainerDied","Data":"555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf"}
Mar 13 20:52:46 crc kubenswrapper[4790]: I0313 20:52:46.254593 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Mar 13 20:52:46 crc kubenswrapper[4790]: I0313 20:52:46.468402 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xck9d" event={"ID":"8e61b37d-27a6-44dc-83c2-1aa0b9850465","Type":"ContainerStarted","Data":"c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819"}
Mar 13 20:52:46 crc kubenswrapper[4790]: I0313 20:52:46.492713 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xck9d" podStartSLOduration=4.032197105 podStartE2EDuration="7.492694107s" podCreationTimestamp="2026-03-13 20:52:39 +0000 UTC" firstStartedPulling="2026-03-13 20:52:42.418460603 +0000 UTC m=+1493.439576494" lastFinishedPulling="2026-03-13 20:52:45.878957605 +0000 UTC m=+1496.900073496" observedRunningTime="2026-03-13 20:52:46.487321051 +0000 UTC m=+1497.508436942" watchObservedRunningTime="2026-03-13 20:52:46.492694107 +0000 UTC m=+1497.513809998"
Mar 13 20:52:49 crc kubenswrapper[4790]: I0313 20:52:49.790305 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xck9d"
Mar 13 20:52:49 crc kubenswrapper[4790]: I0313 20:52:49.790861 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xck9d"
Mar 13 20:52:50 crc kubenswrapper[4790]: I0313 20:52:50.833522 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xck9d" podUID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerName="registry-server" probeResult="failure" output=<
Mar 13 20:52:50 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s
Mar 13 20:52:50 crc kubenswrapper[4790]: >
Mar 13 20:52:52 crc kubenswrapper[4790]: I0313 20:52:52.985722 4790 scope.go:117] "RemoveContainer" containerID="a76e1c0d1beff75ffaa42ee8715fd9733a320b575bcb2a1602abbb7840ddf694"
Mar 13 20:52:53 crc kubenswrapper[4790]: I0313 20:52:53.019640 4790 scope.go:117] "RemoveContainer" containerID="7c5a942da36087bdc3e181e8806caccf07be11d3c05fd4b5b28443007ca270c8"
Mar 13 20:52:53 crc kubenswrapper[4790]: I0313 20:52:53.053605 4790 scope.go:117] "RemoveContainer" containerID="a469cae8d28a17763807dc70d5fbc5f435ef49995e55c306927cfc053eea835d"
Mar 13 20:52:53 crc kubenswrapper[4790]: I0313 20:52:53.073599 4790 scope.go:117] "RemoveContainer" containerID="08d59d9ecbc8376b9de39bc3a93a8ca2a0b84d09598e5daa63ce7fe053fdaadf"
Mar 13 20:52:53 crc kubenswrapper[4790]: I0313 20:52:53.541855 4790 generic.go:334] "Generic (PLEG): container finished" podID="37459d15-1599-492b-8710-7723829a096d" containerID="68cbadfb5fc5eaf6da98e8021273feb1407956bc4335ba31888729ab3136c705" exitCode=0
Mar 13 20:52:53 crc kubenswrapper[4790]: I0313 20:52:53.542202 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" event={"ID":"37459d15-1599-492b-8710-7723829a096d","Type":"ContainerDied","Data":"68cbadfb5fc5eaf6da98e8021273feb1407956bc4335ba31888729ab3136c705"}
Mar 13 20:52:54 crc kubenswrapper[4790]: I0313 20:52:54.993271 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h"
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.033897 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-ssh-key-openstack-edpm-ipam\") pod \"37459d15-1599-492b-8710-7723829a096d\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") "
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.034044 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56jjn\" (UniqueName: \"kubernetes.io/projected/37459d15-1599-492b-8710-7723829a096d-kube-api-access-56jjn\") pod \"37459d15-1599-492b-8710-7723829a096d\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") "
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.034187 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-inventory\") pod \"37459d15-1599-492b-8710-7723829a096d\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") "
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.034271 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-repo-setup-combined-ca-bundle\") pod \"37459d15-1599-492b-8710-7723829a096d\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") "
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.041164 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "37459d15-1599-492b-8710-7723829a096d" (UID: "37459d15-1599-492b-8710-7723829a096d"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.044610 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37459d15-1599-492b-8710-7723829a096d-kube-api-access-56jjn" (OuterVolumeSpecName: "kube-api-access-56jjn") pod "37459d15-1599-492b-8710-7723829a096d" (UID: "37459d15-1599-492b-8710-7723829a096d"). InnerVolumeSpecName "kube-api-access-56jjn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.061649 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "37459d15-1599-492b-8710-7723829a096d" (UID: "37459d15-1599-492b-8710-7723829a096d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.066833 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-inventory" (OuterVolumeSpecName: "inventory") pod "37459d15-1599-492b-8710-7723829a096d" (UID: "37459d15-1599-492b-8710-7723829a096d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.138176 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.138470 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56jjn\" (UniqueName: \"kubernetes.io/projected/37459d15-1599-492b-8710-7723829a096d-kube-api-access-56jjn\") on node \"crc\" DevicePath \"\""
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.138557 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-inventory\") on node \"crc\" DevicePath \"\""
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.138670 4790 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.561578 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" event={"ID":"37459d15-1599-492b-8710-7723829a096d","Type":"ContainerDied","Data":"011eaa7798518916fd1cd36b7162e72c8aa57ffd69ac6ec084b6add865ff6d11"}
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.561627 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="011eaa7798518916fd1cd36b7162e72c8aa57ffd69ac6ec084b6add865ff6d11"
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.561648 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h"
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.647077 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2"]
Mar 13 20:52:55 crc kubenswrapper[4790]: E0313 20:52:55.647595 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37459d15-1599-492b-8710-7723829a096d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.647621 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="37459d15-1599-492b-8710-7723829a096d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.647865 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="37459d15-1599-492b-8710-7723829a096d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.648694 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2"
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.651904 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.652478 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.652849 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m"
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.654292 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.657568 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2"]
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.750011 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgxl2\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2"
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.750174 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgxl2\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2"
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.750270 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pknwd\" (UniqueName: \"kubernetes.io/projected/6383acac-fad0-45d2-8263-da2ceb0b9e83-kube-api-access-pknwd\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgxl2\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2"
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.852311 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgxl2\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2"
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.852441 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pknwd\" (UniqueName: \"kubernetes.io/projected/6383acac-fad0-45d2-8263-da2ceb0b9e83-kube-api-access-pknwd\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgxl2\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2"
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.852523 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgxl2\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2"
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.856336 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgxl2\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2"
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.865483 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgxl2\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2"
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.871064 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pknwd\" (UniqueName: \"kubernetes.io/projected/6383acac-fad0-45d2-8263-da2ceb0b9e83-kube-api-access-pknwd\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgxl2\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2"
Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.972916 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" Mar 13 20:52:56 crc kubenswrapper[4790]: I0313 20:52:56.490622 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2"] Mar 13 20:52:56 crc kubenswrapper[4790]: I0313 20:52:56.572783 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" event={"ID":"6383acac-fad0-45d2-8263-da2ceb0b9e83","Type":"ContainerStarted","Data":"23954afae29dc0d42d7aa797a383ebbedc7c5ae34a41912117cbf97794e9592c"} Mar 13 20:52:57 crc kubenswrapper[4790]: I0313 20:52:57.583702 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" event={"ID":"6383acac-fad0-45d2-8263-da2ceb0b9e83","Type":"ContainerStarted","Data":"e536e5bc016726000a9f433d8e01bfad7c9bcef53ef7691e5f152680b0727e30"} Mar 13 20:52:57 crc kubenswrapper[4790]: I0313 20:52:57.615998 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" podStartSLOduration=2.201365979 podStartE2EDuration="2.615977662s" podCreationTimestamp="2026-03-13 20:52:55 +0000 UTC" firstStartedPulling="2026-03-13 20:52:56.496725773 +0000 UTC m=+1507.517841664" lastFinishedPulling="2026-03-13 20:52:56.911337456 +0000 UTC m=+1507.932453347" observedRunningTime="2026-03-13 20:52:57.609953748 +0000 UTC m=+1508.631069639" watchObservedRunningTime="2026-03-13 20:52:57.615977662 +0000 UTC m=+1508.637093553" Mar 13 20:52:59 crc kubenswrapper[4790]: I0313 20:52:59.603533 4790 generic.go:334] "Generic (PLEG): container finished" podID="6383acac-fad0-45d2-8263-da2ceb0b9e83" containerID="e536e5bc016726000a9f433d8e01bfad7c9bcef53ef7691e5f152680b0727e30" exitCode=0 Mar 13 20:52:59 crc kubenswrapper[4790]: I0313 20:52:59.603645 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" event={"ID":"6383acac-fad0-45d2-8263-da2ceb0b9e83","Type":"ContainerDied","Data":"e536e5bc016726000a9f433d8e01bfad7c9bcef53ef7691e5f152680b0727e30"} Mar 13 20:52:59 crc kubenswrapper[4790]: I0313 20:52:59.837859 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xck9d" Mar 13 20:52:59 crc kubenswrapper[4790]: I0313 20:52:59.890785 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xck9d" Mar 13 20:53:00 crc kubenswrapper[4790]: I0313 20:53:00.071368 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xck9d"] Mar 13 20:53:00 crc kubenswrapper[4790]: I0313 20:53:00.994984 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.062136 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-ssh-key-openstack-edpm-ipam\") pod \"6383acac-fad0-45d2-8263-da2ceb0b9e83\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.062231 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-inventory\") pod \"6383acac-fad0-45d2-8263-da2ceb0b9e83\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.062310 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pknwd\" (UniqueName: \"kubernetes.io/projected/6383acac-fad0-45d2-8263-da2ceb0b9e83-kube-api-access-pknwd\") pod 
\"6383acac-fad0-45d2-8263-da2ceb0b9e83\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.069233 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6383acac-fad0-45d2-8263-da2ceb0b9e83-kube-api-access-pknwd" (OuterVolumeSpecName: "kube-api-access-pknwd") pod "6383acac-fad0-45d2-8263-da2ceb0b9e83" (UID: "6383acac-fad0-45d2-8263-da2ceb0b9e83"). InnerVolumeSpecName "kube-api-access-pknwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.093501 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-inventory" (OuterVolumeSpecName: "inventory") pod "6383acac-fad0-45d2-8263-da2ceb0b9e83" (UID: "6383acac-fad0-45d2-8263-da2ceb0b9e83"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.102589 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6383acac-fad0-45d2-8263-da2ceb0b9e83" (UID: "6383acac-fad0-45d2-8263-da2ceb0b9e83"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.165683 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.166001 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.166137 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pknwd\" (UniqueName: \"kubernetes.io/projected/6383acac-fad0-45d2-8263-da2ceb0b9e83-kube-api-access-pknwd\") on node \"crc\" DevicePath \"\"" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.623250 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" event={"ID":"6383acac-fad0-45d2-8263-da2ceb0b9e83","Type":"ContainerDied","Data":"23954afae29dc0d42d7aa797a383ebbedc7c5ae34a41912117cbf97794e9592c"} Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.623317 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23954afae29dc0d42d7aa797a383ebbedc7c5ae34a41912117cbf97794e9592c" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.623263 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.623706 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xck9d" podUID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerName="registry-server" containerID="cri-o://c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819" gracePeriod=2 Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.698442 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n"] Mar 13 20:53:01 crc kubenswrapper[4790]: E0313 20:53:01.698999 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6383acac-fad0-45d2-8263-da2ceb0b9e83" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.699090 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6383acac-fad0-45d2-8263-da2ceb0b9e83" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.699351 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6383acac-fad0-45d2-8263-da2ceb0b9e83" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.699977 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.705534 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.705907 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.706075 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.706253 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.709950 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n"] Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.779137 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.779206 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.779271 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.779315 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctzx4\" (UniqueName: \"kubernetes.io/projected/5fc3181b-a2df-4d5c-afa1-057cef46dd95-kube-api-access-ctzx4\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.881146 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.881231 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.881271 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.881307 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctzx4\" (UniqueName: \"kubernetes.io/projected/5fc3181b-a2df-4d5c-afa1-057cef46dd95-kube-api-access-ctzx4\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.886238 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.889092 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.893303 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.900274 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctzx4\" (UniqueName: \"kubernetes.io/projected/5fc3181b-a2df-4d5c-afa1-057cef46dd95-kube-api-access-ctzx4\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.069885 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xck9d" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.118428 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.189023 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-utilities\") pod \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.189350 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-catalog-content\") pod \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.189425 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5xmz\" (UniqueName: \"kubernetes.io/projected/8e61b37d-27a6-44dc-83c2-1aa0b9850465-kube-api-access-j5xmz\") pod \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " Mar 13 20:53:02 crc 
kubenswrapper[4790]: I0313 20:53:02.190130 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-utilities" (OuterVolumeSpecName: "utilities") pod "8e61b37d-27a6-44dc-83c2-1aa0b9850465" (UID: "8e61b37d-27a6-44dc-83c2-1aa0b9850465"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.196144 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e61b37d-27a6-44dc-83c2-1aa0b9850465-kube-api-access-j5xmz" (OuterVolumeSpecName: "kube-api-access-j5xmz") pod "8e61b37d-27a6-44dc-83c2-1aa0b9850465" (UID: "8e61b37d-27a6-44dc-83c2-1aa0b9850465"). InnerVolumeSpecName "kube-api-access-j5xmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.291982 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.292026 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5xmz\" (UniqueName: \"kubernetes.io/projected/8e61b37d-27a6-44dc-83c2-1aa0b9850465-kube-api-access-j5xmz\") on node \"crc\" DevicePath \"\"" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.360566 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e61b37d-27a6-44dc-83c2-1aa0b9850465" (UID: "8e61b37d-27a6-44dc-83c2-1aa0b9850465"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.394084 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.629643 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n"] Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.634543 4790 generic.go:334] "Generic (PLEG): container finished" podID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerID="c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819" exitCode=0 Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.634604 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xck9d" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.634624 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xck9d" event={"ID":"8e61b37d-27a6-44dc-83c2-1aa0b9850465","Type":"ContainerDied","Data":"c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819"} Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.634684 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xck9d" event={"ID":"8e61b37d-27a6-44dc-83c2-1aa0b9850465","Type":"ContainerDied","Data":"a7c04727a428df15d4ea2c9336e79e6ee57e874b67d1a38f59d2d7e8b47dd15c"} Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.634714 4790 scope.go:117] "RemoveContainer" containerID="c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.659970 4790 scope.go:117] "RemoveContainer" containerID="555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf" Mar 13 20:53:02 crc kubenswrapper[4790]: 
I0313 20:53:02.674160 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xck9d"] Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.687627 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xck9d"] Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.690810 4790 scope.go:117] "RemoveContainer" containerID="9b31f907153625e03b453ccf0cbfa00742e1266b409c07a40f62754b64b0c28e" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.707346 4790 scope.go:117] "RemoveContainer" containerID="c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819" Mar 13 20:53:02 crc kubenswrapper[4790]: E0313 20:53:02.708025 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819\": container with ID starting with c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819 not found: ID does not exist" containerID="c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.708073 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819"} err="failed to get container status \"c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819\": rpc error: code = NotFound desc = could not find container \"c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819\": container with ID starting with c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819 not found: ID does not exist" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.708105 4790 scope.go:117] "RemoveContainer" containerID="555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf" Mar 13 20:53:02 crc kubenswrapper[4790]: E0313 20:53:02.708670 4790 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf\": container with ID starting with 555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf not found: ID does not exist" containerID="555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.708723 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf"} err="failed to get container status \"555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf\": rpc error: code = NotFound desc = could not find container \"555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf\": container with ID starting with 555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf not found: ID does not exist" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.708754 4790 scope.go:117] "RemoveContainer" containerID="9b31f907153625e03b453ccf0cbfa00742e1266b409c07a40f62754b64b0c28e" Mar 13 20:53:02 crc kubenswrapper[4790]: E0313 20:53:02.709087 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b31f907153625e03b453ccf0cbfa00742e1266b409c07a40f62754b64b0c28e\": container with ID starting with 9b31f907153625e03b453ccf0cbfa00742e1266b409c07a40f62754b64b0c28e not found: ID does not exist" containerID="9b31f907153625e03b453ccf0cbfa00742e1266b409c07a40f62754b64b0c28e" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.709127 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b31f907153625e03b453ccf0cbfa00742e1266b409c07a40f62754b64b0c28e"} err="failed to get container status \"9b31f907153625e03b453ccf0cbfa00742e1266b409c07a40f62754b64b0c28e\": rpc error: code = NotFound desc = could not find container 
\"9b31f907153625e03b453ccf0cbfa00742e1266b409c07a40f62754b64b0c28e\": container with ID starting with 9b31f907153625e03b453ccf0cbfa00742e1266b409c07a40f62754b64b0c28e not found: ID does not exist" Mar 13 20:53:03 crc kubenswrapper[4790]: I0313 20:53:03.645978 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" event={"ID":"5fc3181b-a2df-4d5c-afa1-057cef46dd95","Type":"ContainerStarted","Data":"ef1ad01ff1610150e75c805dfbe677ad94c23d1f578c4b9bb8893fd71bbdb07d"} Mar 13 20:53:03 crc kubenswrapper[4790]: I0313 20:53:03.646327 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" event={"ID":"5fc3181b-a2df-4d5c-afa1-057cef46dd95","Type":"ContainerStarted","Data":"8a8a4b31d38642270b5c6ca8e8476670fc95c963faff03b5219523182e59cc45"} Mar 13 20:53:03 crc kubenswrapper[4790]: I0313 20:53:03.681191 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" path="/var/lib/kubelet/pods/8e61b37d-27a6-44dc-83c2-1aa0b9850465/volumes" Mar 13 20:53:44 crc kubenswrapper[4790]: I0313 20:53:44.015528 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:53:44 crc kubenswrapper[4790]: I0313 20:53:44.016036 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:53:53 crc kubenswrapper[4790]: I0313 20:53:53.250604 4790 scope.go:117] "RemoveContainer" 
containerID="be394eadb1a12955ac79ebd44714ea2fd283def65154fb6c18e14cac83eb1a07" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.162001 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" podStartSLOduration=58.72791871 podStartE2EDuration="59.161983362s" podCreationTimestamp="2026-03-13 20:53:01 +0000 UTC" firstStartedPulling="2026-03-13 20:53:02.633822876 +0000 UTC m=+1513.654938767" lastFinishedPulling="2026-03-13 20:53:03.067887528 +0000 UTC m=+1514.089003419" observedRunningTime="2026-03-13 20:53:03.668169824 +0000 UTC m=+1514.689285735" watchObservedRunningTime="2026-03-13 20:54:00.161983362 +0000 UTC m=+1571.183099273" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.172839 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557254-62lxw"] Mar 13 20:54:00 crc kubenswrapper[4790]: E0313 20:54:00.173349 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerName="extract-content" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.173390 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerName="extract-content" Mar 13 20:54:00 crc kubenswrapper[4790]: E0313 20:54:00.173412 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerName="registry-server" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.173421 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerName="registry-server" Mar 13 20:54:00 crc kubenswrapper[4790]: E0313 20:54:00.173462 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerName="extract-utilities" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.173471 4790 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerName="extract-utilities" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.173717 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerName="registry-server" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.174505 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557254-62lxw" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.180207 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.180708 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.181608 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.186892 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557254-62lxw"] Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.196269 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwk9s\" (UniqueName: \"kubernetes.io/projected/173eb1b0-728a-4420-bfab-ba33ae08f5eb-kube-api-access-zwk9s\") pod \"auto-csr-approver-29557254-62lxw\" (UID: \"173eb1b0-728a-4420-bfab-ba33ae08f5eb\") " pod="openshift-infra/auto-csr-approver-29557254-62lxw" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.297103 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwk9s\" (UniqueName: \"kubernetes.io/projected/173eb1b0-728a-4420-bfab-ba33ae08f5eb-kube-api-access-zwk9s\") pod \"auto-csr-approver-29557254-62lxw\" (UID: \"173eb1b0-728a-4420-bfab-ba33ae08f5eb\") " 
pod="openshift-infra/auto-csr-approver-29557254-62lxw" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.316820 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwk9s\" (UniqueName: \"kubernetes.io/projected/173eb1b0-728a-4420-bfab-ba33ae08f5eb-kube-api-access-zwk9s\") pod \"auto-csr-approver-29557254-62lxw\" (UID: \"173eb1b0-728a-4420-bfab-ba33ae08f5eb\") " pod="openshift-infra/auto-csr-approver-29557254-62lxw" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.494272 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557254-62lxw" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.953659 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557254-62lxw"] Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.954214 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 20:54:01 crc kubenswrapper[4790]: I0313 20:54:01.149562 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557254-62lxw" event={"ID":"173eb1b0-728a-4420-bfab-ba33ae08f5eb","Type":"ContainerStarted","Data":"95504a8a944ca2a2025a00a28745f457433239f1a60e421ea50cc57ac0d77836"} Mar 13 20:54:03 crc kubenswrapper[4790]: I0313 20:54:03.181239 4790 generic.go:334] "Generic (PLEG): container finished" podID="173eb1b0-728a-4420-bfab-ba33ae08f5eb" containerID="1354228427a90e6609d9b0170fc1b61342fc6ff24449709c9abd0f642ea90a66" exitCode=0 Mar 13 20:54:03 crc kubenswrapper[4790]: I0313 20:54:03.181311 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557254-62lxw" event={"ID":"173eb1b0-728a-4420-bfab-ba33ae08f5eb","Type":"ContainerDied","Data":"1354228427a90e6609d9b0170fc1b61342fc6ff24449709c9abd0f642ea90a66"} Mar 13 20:54:04 crc kubenswrapper[4790]: I0313 20:54:04.483176 4790 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557254-62lxw" Mar 13 20:54:04 crc kubenswrapper[4790]: I0313 20:54:04.679197 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwk9s\" (UniqueName: \"kubernetes.io/projected/173eb1b0-728a-4420-bfab-ba33ae08f5eb-kube-api-access-zwk9s\") pod \"173eb1b0-728a-4420-bfab-ba33ae08f5eb\" (UID: \"173eb1b0-728a-4420-bfab-ba33ae08f5eb\") " Mar 13 20:54:04 crc kubenswrapper[4790]: I0313 20:54:04.687220 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/173eb1b0-728a-4420-bfab-ba33ae08f5eb-kube-api-access-zwk9s" (OuterVolumeSpecName: "kube-api-access-zwk9s") pod "173eb1b0-728a-4420-bfab-ba33ae08f5eb" (UID: "173eb1b0-728a-4420-bfab-ba33ae08f5eb"). InnerVolumeSpecName "kube-api-access-zwk9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:54:04 crc kubenswrapper[4790]: I0313 20:54:04.782411 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwk9s\" (UniqueName: \"kubernetes.io/projected/173eb1b0-728a-4420-bfab-ba33ae08f5eb-kube-api-access-zwk9s\") on node \"crc\" DevicePath \"\"" Mar 13 20:54:05 crc kubenswrapper[4790]: I0313 20:54:05.201253 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557254-62lxw" event={"ID":"173eb1b0-728a-4420-bfab-ba33ae08f5eb","Type":"ContainerDied","Data":"95504a8a944ca2a2025a00a28745f457433239f1a60e421ea50cc57ac0d77836"} Mar 13 20:54:05 crc kubenswrapper[4790]: I0313 20:54:05.201621 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95504a8a944ca2a2025a00a28745f457433239f1a60e421ea50cc57ac0d77836" Mar 13 20:54:05 crc kubenswrapper[4790]: I0313 20:54:05.201344 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557254-62lxw" Mar 13 20:54:05 crc kubenswrapper[4790]: I0313 20:54:05.548977 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557248-msw96"] Mar 13 20:54:05 crc kubenswrapper[4790]: I0313 20:54:05.564152 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557248-msw96"] Mar 13 20:54:05 crc kubenswrapper[4790]: I0313 20:54:05.669772 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eda4da8c-f54a-4c25-9669-ff180aa0b9a9" path="/var/lib/kubelet/pods/eda4da8c-f54a-4c25-9669-ff180aa0b9a9/volumes" Mar 13 20:54:14 crc kubenswrapper[4790]: I0313 20:54:14.016131 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:54:14 crc kubenswrapper[4790]: I0313 20:54:14.017548 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:54:36 crc kubenswrapper[4790]: I0313 20:54:36.929500 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lczcj"] Mar 13 20:54:36 crc kubenswrapper[4790]: E0313 20:54:36.930444 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="173eb1b0-728a-4420-bfab-ba33ae08f5eb" containerName="oc" Mar 13 20:54:36 crc kubenswrapper[4790]: I0313 20:54:36.930457 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="173eb1b0-728a-4420-bfab-ba33ae08f5eb" containerName="oc" Mar 13 20:54:36 crc 
kubenswrapper[4790]: I0313 20:54:36.930688 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="173eb1b0-728a-4420-bfab-ba33ae08f5eb" containerName="oc" Mar 13 20:54:36 crc kubenswrapper[4790]: I0313 20:54:36.932018 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:36 crc kubenswrapper[4790]: I0313 20:54:36.945303 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lczcj"] Mar 13 20:54:37 crc kubenswrapper[4790]: I0313 20:54:37.108584 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59rk7\" (UniqueName: \"kubernetes.io/projected/58e8b831-38b3-41f5-b0db-341376a43ee7-kube-api-access-59rk7\") pod \"redhat-marketplace-lczcj\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:37 crc kubenswrapper[4790]: I0313 20:54:37.108651 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-catalog-content\") pod \"redhat-marketplace-lczcj\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:37 crc kubenswrapper[4790]: I0313 20:54:37.108712 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-utilities\") pod \"redhat-marketplace-lczcj\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:37 crc kubenswrapper[4790]: I0313 20:54:37.210116 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59rk7\" (UniqueName: 
\"kubernetes.io/projected/58e8b831-38b3-41f5-b0db-341376a43ee7-kube-api-access-59rk7\") pod \"redhat-marketplace-lczcj\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:37 crc kubenswrapper[4790]: I0313 20:54:37.210181 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-catalog-content\") pod \"redhat-marketplace-lczcj\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:37 crc kubenswrapper[4790]: I0313 20:54:37.210235 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-utilities\") pod \"redhat-marketplace-lczcj\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:37 crc kubenswrapper[4790]: I0313 20:54:37.210770 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-utilities\") pod \"redhat-marketplace-lczcj\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:37 crc kubenswrapper[4790]: I0313 20:54:37.210985 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-catalog-content\") pod \"redhat-marketplace-lczcj\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:37 crc kubenswrapper[4790]: I0313 20:54:37.234159 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59rk7\" (UniqueName: 
\"kubernetes.io/projected/58e8b831-38b3-41f5-b0db-341376a43ee7-kube-api-access-59rk7\") pod \"redhat-marketplace-lczcj\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:37 crc kubenswrapper[4790]: I0313 20:54:37.252556 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:37 crc kubenswrapper[4790]: I0313 20:54:37.724827 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lczcj"] Mar 13 20:54:38 crc kubenswrapper[4790]: I0313 20:54:38.491071 4790 generic.go:334] "Generic (PLEG): container finished" podID="58e8b831-38b3-41f5-b0db-341376a43ee7" containerID="fffa56def434067a6e0068caab2a2962945497b9d0362445f755d014aa917e50" exitCode=0 Mar 13 20:54:38 crc kubenswrapper[4790]: I0313 20:54:38.491192 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lczcj" event={"ID":"58e8b831-38b3-41f5-b0db-341376a43ee7","Type":"ContainerDied","Data":"fffa56def434067a6e0068caab2a2962945497b9d0362445f755d014aa917e50"} Mar 13 20:54:38 crc kubenswrapper[4790]: I0313 20:54:38.491471 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lczcj" event={"ID":"58e8b831-38b3-41f5-b0db-341376a43ee7","Type":"ContainerStarted","Data":"865528cf8a33a2814a57e2c3535b244c3a265fa0760972e7defe25f2fc5fe2d7"} Mar 13 20:54:44 crc kubenswrapper[4790]: I0313 20:54:44.015589 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:54:44 crc kubenswrapper[4790]: I0313 20:54:44.016175 4790 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:54:44 crc kubenswrapper[4790]: I0313 20:54:44.016229 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:54:44 crc kubenswrapper[4790]: I0313 20:54:44.017134 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 20:54:44 crc kubenswrapper[4790]: I0313 20:54:44.017200 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" gracePeriod=600 Mar 13 20:54:44 crc kubenswrapper[4790]: E0313 20:54:44.135978 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:54:44 crc kubenswrapper[4790]: I0313 20:54:44.555802 4790 generic.go:334] "Generic (PLEG): container finished" podID="58e8b831-38b3-41f5-b0db-341376a43ee7" 
containerID="3b1aa8217bcea5c47403ae5a3cf749fbe15e7addfd050a1bd5ca97417c3867df" exitCode=0 Mar 13 20:54:44 crc kubenswrapper[4790]: I0313 20:54:44.556186 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lczcj" event={"ID":"58e8b831-38b3-41f5-b0db-341376a43ee7","Type":"ContainerDied","Data":"3b1aa8217bcea5c47403ae5a3cf749fbe15e7addfd050a1bd5ca97417c3867df"} Mar 13 20:54:44 crc kubenswrapper[4790]: I0313 20:54:44.561236 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" exitCode=0 Mar 13 20:54:44 crc kubenswrapper[4790]: I0313 20:54:44.561308 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e"} Mar 13 20:54:44 crc kubenswrapper[4790]: I0313 20:54:44.561460 4790 scope.go:117] "RemoveContainer" containerID="7265c148a5840e02c0d05363d253e3b056f233c63bc78d73aa4fcf9dbde019eb" Mar 13 20:54:44 crc kubenswrapper[4790]: I0313 20:54:44.562720 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:54:44 crc kubenswrapper[4790]: E0313 20:54:44.563543 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:54:45 crc kubenswrapper[4790]: I0313 20:54:45.573248 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-lczcj" event={"ID":"58e8b831-38b3-41f5-b0db-341376a43ee7","Type":"ContainerStarted","Data":"6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8"} Mar 13 20:54:45 crc kubenswrapper[4790]: I0313 20:54:45.595697 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lczcj" podStartSLOduration=3.012902525 podStartE2EDuration="9.595678316s" podCreationTimestamp="2026-03-13 20:54:36 +0000 UTC" firstStartedPulling="2026-03-13 20:54:38.493065916 +0000 UTC m=+1609.514181807" lastFinishedPulling="2026-03-13 20:54:45.075841707 +0000 UTC m=+1616.096957598" observedRunningTime="2026-03-13 20:54:45.591553112 +0000 UTC m=+1616.612669013" watchObservedRunningTime="2026-03-13 20:54:45.595678316 +0000 UTC m=+1616.616794207" Mar 13 20:54:47 crc kubenswrapper[4790]: I0313 20:54:47.252683 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:47 crc kubenswrapper[4790]: I0313 20:54:47.253034 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:47 crc kubenswrapper[4790]: I0313 20:54:47.300985 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:53 crc kubenswrapper[4790]: I0313 20:54:53.351712 4790 scope.go:117] "RemoveContainer" containerID="3e3742b7258e70b94cf2ef846ea4b59ba8175c78c72478006fdab7b609eebe2a" Mar 13 20:54:55 crc kubenswrapper[4790]: I0313 20:54:55.659438 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:54:55 crc kubenswrapper[4790]: E0313 20:54:55.659897 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:54:57 crc kubenswrapper[4790]: I0313 20:54:57.309489 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:57 crc kubenswrapper[4790]: I0313 20:54:57.366885 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lczcj"] Mar 13 20:54:57 crc kubenswrapper[4790]: I0313 20:54:57.689801 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lczcj" podUID="58e8b831-38b3-41f5-b0db-341376a43ee7" containerName="registry-server" containerID="cri-o://6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8" gracePeriod=2 Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.143907 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.268258 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-utilities\") pod \"58e8b831-38b3-41f5-b0db-341376a43ee7\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.268369 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59rk7\" (UniqueName: \"kubernetes.io/projected/58e8b831-38b3-41f5-b0db-341376a43ee7-kube-api-access-59rk7\") pod \"58e8b831-38b3-41f5-b0db-341376a43ee7\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.268574 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-catalog-content\") pod \"58e8b831-38b3-41f5-b0db-341376a43ee7\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.269368 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-utilities" (OuterVolumeSpecName: "utilities") pod "58e8b831-38b3-41f5-b0db-341376a43ee7" (UID: "58e8b831-38b3-41f5-b0db-341376a43ee7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.273912 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58e8b831-38b3-41f5-b0db-341376a43ee7-kube-api-access-59rk7" (OuterVolumeSpecName: "kube-api-access-59rk7") pod "58e8b831-38b3-41f5-b0db-341376a43ee7" (UID: "58e8b831-38b3-41f5-b0db-341376a43ee7"). InnerVolumeSpecName "kube-api-access-59rk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.296754 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58e8b831-38b3-41f5-b0db-341376a43ee7" (UID: "58e8b831-38b3-41f5-b0db-341376a43ee7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.370418 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.370459 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59rk7\" (UniqueName: \"kubernetes.io/projected/58e8b831-38b3-41f5-b0db-341376a43ee7-kube-api-access-59rk7\") on node \"crc\" DevicePath \"\"" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.370470 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.698955 4790 generic.go:334] "Generic (PLEG): container finished" podID="58e8b831-38b3-41f5-b0db-341376a43ee7" containerID="6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8" exitCode=0 Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.699000 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.699018 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lczcj" event={"ID":"58e8b831-38b3-41f5-b0db-341376a43ee7","Type":"ContainerDied","Data":"6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8"} Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.699373 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lczcj" event={"ID":"58e8b831-38b3-41f5-b0db-341376a43ee7","Type":"ContainerDied","Data":"865528cf8a33a2814a57e2c3535b244c3a265fa0760972e7defe25f2fc5fe2d7"} Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.699404 4790 scope.go:117] "RemoveContainer" containerID="6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.719712 4790 scope.go:117] "RemoveContainer" containerID="3b1aa8217bcea5c47403ae5a3cf749fbe15e7addfd050a1bd5ca97417c3867df" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.735855 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lczcj"] Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.744799 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lczcj"] Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.760585 4790 scope.go:117] "RemoveContainer" containerID="fffa56def434067a6e0068caab2a2962945497b9d0362445f755d014aa917e50" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.788019 4790 scope.go:117] "RemoveContainer" containerID="6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8" Mar 13 20:54:58 crc kubenswrapper[4790]: E0313 20:54:58.788549 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8\": container with ID starting with 6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8 not found: ID does not exist" containerID="6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.788596 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8"} err="failed to get container status \"6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8\": rpc error: code = NotFound desc = could not find container \"6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8\": container with ID starting with 6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8 not found: ID does not exist" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.788618 4790 scope.go:117] "RemoveContainer" containerID="3b1aa8217bcea5c47403ae5a3cf749fbe15e7addfd050a1bd5ca97417c3867df" Mar 13 20:54:58 crc kubenswrapper[4790]: E0313 20:54:58.789088 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b1aa8217bcea5c47403ae5a3cf749fbe15e7addfd050a1bd5ca97417c3867df\": container with ID starting with 3b1aa8217bcea5c47403ae5a3cf749fbe15e7addfd050a1bd5ca97417c3867df not found: ID does not exist" containerID="3b1aa8217bcea5c47403ae5a3cf749fbe15e7addfd050a1bd5ca97417c3867df" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.789115 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b1aa8217bcea5c47403ae5a3cf749fbe15e7addfd050a1bd5ca97417c3867df"} err="failed to get container status \"3b1aa8217bcea5c47403ae5a3cf749fbe15e7addfd050a1bd5ca97417c3867df\": rpc error: code = NotFound desc = could not find container \"3b1aa8217bcea5c47403ae5a3cf749fbe15e7addfd050a1bd5ca97417c3867df\": container with ID 
starting with 3b1aa8217bcea5c47403ae5a3cf749fbe15e7addfd050a1bd5ca97417c3867df not found: ID does not exist" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.789132 4790 scope.go:117] "RemoveContainer" containerID="fffa56def434067a6e0068caab2a2962945497b9d0362445f755d014aa917e50" Mar 13 20:54:58 crc kubenswrapper[4790]: E0313 20:54:58.795078 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fffa56def434067a6e0068caab2a2962945497b9d0362445f755d014aa917e50\": container with ID starting with fffa56def434067a6e0068caab2a2962945497b9d0362445f755d014aa917e50 not found: ID does not exist" containerID="fffa56def434067a6e0068caab2a2962945497b9d0362445f755d014aa917e50" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.795135 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fffa56def434067a6e0068caab2a2962945497b9d0362445f755d014aa917e50"} err="failed to get container status \"fffa56def434067a6e0068caab2a2962945497b9d0362445f755d014aa917e50\": rpc error: code = NotFound desc = could not find container \"fffa56def434067a6e0068caab2a2962945497b9d0362445f755d014aa917e50\": container with ID starting with fffa56def434067a6e0068caab2a2962945497b9d0362445f755d014aa917e50 not found: ID does not exist" Mar 13 20:54:59 crc kubenswrapper[4790]: I0313 20:54:59.673442 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58e8b831-38b3-41f5-b0db-341376a43ee7" path="/var/lib/kubelet/pods/58e8b831-38b3-41f5-b0db-341376a43ee7/volumes" Mar 13 20:55:07 crc kubenswrapper[4790]: I0313 20:55:07.659655 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:55:07 crc kubenswrapper[4790]: E0313 20:55:07.660333 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.179767 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sfmsk"] Mar 13 20:55:09 crc kubenswrapper[4790]: E0313 20:55:09.180174 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e8b831-38b3-41f5-b0db-341376a43ee7" containerName="extract-content" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.180187 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e8b831-38b3-41f5-b0db-341376a43ee7" containerName="extract-content" Mar 13 20:55:09 crc kubenswrapper[4790]: E0313 20:55:09.180201 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e8b831-38b3-41f5-b0db-341376a43ee7" containerName="registry-server" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.180209 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e8b831-38b3-41f5-b0db-341376a43ee7" containerName="registry-server" Mar 13 20:55:09 crc kubenswrapper[4790]: E0313 20:55:09.180230 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e8b831-38b3-41f5-b0db-341376a43ee7" containerName="extract-utilities" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.180239 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e8b831-38b3-41f5-b0db-341376a43ee7" containerName="extract-utilities" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.180468 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="58e8b831-38b3-41f5-b0db-341376a43ee7" containerName="registry-server" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.181760 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.194576 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sfmsk"] Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.294866 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4bp8\" (UniqueName: \"kubernetes.io/projected/13b049e8-3316-420e-9ec2-a83f7c645d0d-kube-api-access-s4bp8\") pod \"certified-operators-sfmsk\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.294927 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-catalog-content\") pod \"certified-operators-sfmsk\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.294981 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-utilities\") pod \"certified-operators-sfmsk\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.397020 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-utilities\") pod \"certified-operators-sfmsk\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.397206 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s4bp8\" (UniqueName: \"kubernetes.io/projected/13b049e8-3316-420e-9ec2-a83f7c645d0d-kube-api-access-s4bp8\") pod \"certified-operators-sfmsk\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.397259 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-catalog-content\") pod \"certified-operators-sfmsk\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.397800 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-catalog-content\") pod \"certified-operators-sfmsk\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.398071 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-utilities\") pod \"certified-operators-sfmsk\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.420677 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4bp8\" (UniqueName: \"kubernetes.io/projected/13b049e8-3316-420e-9ec2-a83f7c645d0d-kube-api-access-s4bp8\") pod \"certified-operators-sfmsk\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.500901 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:10 crc kubenswrapper[4790]: I0313 20:55:10.020711 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sfmsk"] Mar 13 20:55:10 crc kubenswrapper[4790]: I0313 20:55:10.848319 4790 generic.go:334] "Generic (PLEG): container finished" podID="13b049e8-3316-420e-9ec2-a83f7c645d0d" containerID="c3d9be8124d1105bee607ecc0ad1a5c568aca6c2f3e5185431d764b53b6d3a7f" exitCode=0 Mar 13 20:55:10 crc kubenswrapper[4790]: I0313 20:55:10.848423 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfmsk" event={"ID":"13b049e8-3316-420e-9ec2-a83f7c645d0d","Type":"ContainerDied","Data":"c3d9be8124d1105bee607ecc0ad1a5c568aca6c2f3e5185431d764b53b6d3a7f"} Mar 13 20:55:10 crc kubenswrapper[4790]: I0313 20:55:10.848657 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfmsk" event={"ID":"13b049e8-3316-420e-9ec2-a83f7c645d0d","Type":"ContainerStarted","Data":"f2d2a481f1c0ef44a5d063d8a491e0780cdc0c10daa3fa086aeadd318fcf2d52"} Mar 13 20:55:11 crc kubenswrapper[4790]: I0313 20:55:11.860062 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfmsk" event={"ID":"13b049e8-3316-420e-9ec2-a83f7c645d0d","Type":"ContainerStarted","Data":"0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553"} Mar 13 20:55:12 crc kubenswrapper[4790]: I0313 20:55:12.871864 4790 generic.go:334] "Generic (PLEG): container finished" podID="13b049e8-3316-420e-9ec2-a83f7c645d0d" containerID="0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553" exitCode=0 Mar 13 20:55:12 crc kubenswrapper[4790]: I0313 20:55:12.871936 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfmsk" 
event={"ID":"13b049e8-3316-420e-9ec2-a83f7c645d0d","Type":"ContainerDied","Data":"0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553"} Mar 13 20:55:13 crc kubenswrapper[4790]: I0313 20:55:13.881838 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfmsk" event={"ID":"13b049e8-3316-420e-9ec2-a83f7c645d0d","Type":"ContainerStarted","Data":"3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109"} Mar 13 20:55:13 crc kubenswrapper[4790]: I0313 20:55:13.902454 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sfmsk" podStartSLOduration=2.425414831 podStartE2EDuration="4.902435033s" podCreationTimestamp="2026-03-13 20:55:09 +0000 UTC" firstStartedPulling="2026-03-13 20:55:10.850659753 +0000 UTC m=+1641.871775654" lastFinishedPulling="2026-03-13 20:55:13.327679965 +0000 UTC m=+1644.348795856" observedRunningTime="2026-03-13 20:55:13.899338338 +0000 UTC m=+1644.920454229" watchObservedRunningTime="2026-03-13 20:55:13.902435033 +0000 UTC m=+1644.923550934" Mar 13 20:55:19 crc kubenswrapper[4790]: I0313 20:55:19.501753 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:19 crc kubenswrapper[4790]: I0313 20:55:19.502004 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:19 crc kubenswrapper[4790]: I0313 20:55:19.547422 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:20 crc kubenswrapper[4790]: I0313 20:55:20.006310 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:20 crc kubenswrapper[4790]: I0313 20:55:20.051054 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-sfmsk"] Mar 13 20:55:21 crc kubenswrapper[4790]: I0313 20:55:21.660916 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:55:21 crc kubenswrapper[4790]: E0313 20:55:21.661680 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:55:21 crc kubenswrapper[4790]: I0313 20:55:21.959423 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sfmsk" podUID="13b049e8-3316-420e-9ec2-a83f7c645d0d" containerName="registry-server" containerID="cri-o://3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109" gracePeriod=2 Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.403107 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.591267 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4bp8\" (UniqueName: \"kubernetes.io/projected/13b049e8-3316-420e-9ec2-a83f7c645d0d-kube-api-access-s4bp8\") pod \"13b049e8-3316-420e-9ec2-a83f7c645d0d\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.591430 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-catalog-content\") pod \"13b049e8-3316-420e-9ec2-a83f7c645d0d\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.592192 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-utilities\") pod \"13b049e8-3316-420e-9ec2-a83f7c645d0d\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.593206 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-utilities" (OuterVolumeSpecName: "utilities") pod "13b049e8-3316-420e-9ec2-a83f7c645d0d" (UID: "13b049e8-3316-420e-9ec2-a83f7c645d0d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.597148 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13b049e8-3316-420e-9ec2-a83f7c645d0d-kube-api-access-s4bp8" (OuterVolumeSpecName: "kube-api-access-s4bp8") pod "13b049e8-3316-420e-9ec2-a83f7c645d0d" (UID: "13b049e8-3316-420e-9ec2-a83f7c645d0d"). InnerVolumeSpecName "kube-api-access-s4bp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.694945 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.694995 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4bp8\" (UniqueName: \"kubernetes.io/projected/13b049e8-3316-420e-9ec2-a83f7c645d0d-kube-api-access-s4bp8\") on node \"crc\" DevicePath \"\"" Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.972863 4790 generic.go:334] "Generic (PLEG): container finished" podID="13b049e8-3316-420e-9ec2-a83f7c645d0d" containerID="3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109" exitCode=0 Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.972911 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.972930 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfmsk" event={"ID":"13b049e8-3316-420e-9ec2-a83f7c645d0d","Type":"ContainerDied","Data":"3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109"} Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.973317 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfmsk" event={"ID":"13b049e8-3316-420e-9ec2-a83f7c645d0d","Type":"ContainerDied","Data":"f2d2a481f1c0ef44a5d063d8a491e0780cdc0c10daa3fa086aeadd318fcf2d52"} Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.973336 4790 scope.go:117] "RemoveContainer" containerID="3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109" Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.993682 4790 scope.go:117] "RemoveContainer" 
containerID="0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553" Mar 13 20:55:23 crc kubenswrapper[4790]: I0313 20:55:23.011838 4790 scope.go:117] "RemoveContainer" containerID="c3d9be8124d1105bee607ecc0ad1a5c568aca6c2f3e5185431d764b53b6d3a7f" Mar 13 20:55:23 crc kubenswrapper[4790]: I0313 20:55:23.052977 4790 scope.go:117] "RemoveContainer" containerID="3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109" Mar 13 20:55:23 crc kubenswrapper[4790]: E0313 20:55:23.053353 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109\": container with ID starting with 3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109 not found: ID does not exist" containerID="3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109" Mar 13 20:55:23 crc kubenswrapper[4790]: I0313 20:55:23.053421 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109"} err="failed to get container status \"3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109\": rpc error: code = NotFound desc = could not find container \"3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109\": container with ID starting with 3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109 not found: ID does not exist" Mar 13 20:55:23 crc kubenswrapper[4790]: I0313 20:55:23.053456 4790 scope.go:117] "RemoveContainer" containerID="0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553" Mar 13 20:55:23 crc kubenswrapper[4790]: E0313 20:55:23.053764 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553\": container with ID starting with 
0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553 not found: ID does not exist" containerID="0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553" Mar 13 20:55:23 crc kubenswrapper[4790]: I0313 20:55:23.053794 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553"} err="failed to get container status \"0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553\": rpc error: code = NotFound desc = could not find container \"0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553\": container with ID starting with 0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553 not found: ID does not exist" Mar 13 20:55:23 crc kubenswrapper[4790]: I0313 20:55:23.053812 4790 scope.go:117] "RemoveContainer" containerID="c3d9be8124d1105bee607ecc0ad1a5c568aca6c2f3e5185431d764b53b6d3a7f" Mar 13 20:55:23 crc kubenswrapper[4790]: E0313 20:55:23.054045 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3d9be8124d1105bee607ecc0ad1a5c568aca6c2f3e5185431d764b53b6d3a7f\": container with ID starting with c3d9be8124d1105bee607ecc0ad1a5c568aca6c2f3e5185431d764b53b6d3a7f not found: ID does not exist" containerID="c3d9be8124d1105bee607ecc0ad1a5c568aca6c2f3e5185431d764b53b6d3a7f" Mar 13 20:55:23 crc kubenswrapper[4790]: I0313 20:55:23.054080 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d9be8124d1105bee607ecc0ad1a5c568aca6c2f3e5185431d764b53b6d3a7f"} err="failed to get container status \"c3d9be8124d1105bee607ecc0ad1a5c568aca6c2f3e5185431d764b53b6d3a7f\": rpc error: code = NotFound desc = could not find container \"c3d9be8124d1105bee607ecc0ad1a5c568aca6c2f3e5185431d764b53b6d3a7f\": container with ID starting with c3d9be8124d1105bee607ecc0ad1a5c568aca6c2f3e5185431d764b53b6d3a7f not found: ID does not 
exist" Mar 13 20:55:23 crc kubenswrapper[4790]: I0313 20:55:23.756195 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13b049e8-3316-420e-9ec2-a83f7c645d0d" (UID: "13b049e8-3316-420e-9ec2-a83f7c645d0d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:55:23 crc kubenswrapper[4790]: I0313 20:55:23.813123 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:55:23 crc kubenswrapper[4790]: I0313 20:55:23.912705 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sfmsk"] Mar 13 20:55:23 crc kubenswrapper[4790]: I0313 20:55:23.923395 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sfmsk"] Mar 13 20:55:25 crc kubenswrapper[4790]: I0313 20:55:25.670597 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13b049e8-3316-420e-9ec2-a83f7c645d0d" path="/var/lib/kubelet/pods/13b049e8-3316-420e-9ec2-a83f7c645d0d/volumes" Mar 13 20:55:32 crc kubenswrapper[4790]: I0313 20:55:32.665629 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:55:32 crc kubenswrapper[4790]: E0313 20:55:32.666765 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 
20:55:43 crc kubenswrapper[4790]: I0313 20:55:43.661659 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:55:43 crc kubenswrapper[4790]: E0313 20:55:43.662334 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.028753 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vnc4k"] Mar 13 20:55:49 crc kubenswrapper[4790]: E0313 20:55:49.029755 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b049e8-3316-420e-9ec2-a83f7c645d0d" containerName="extract-content" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.029772 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b049e8-3316-420e-9ec2-a83f7c645d0d" containerName="extract-content" Mar 13 20:55:49 crc kubenswrapper[4790]: E0313 20:55:49.029794 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b049e8-3316-420e-9ec2-a83f7c645d0d" containerName="extract-utilities" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.029803 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b049e8-3316-420e-9ec2-a83f7c645d0d" containerName="extract-utilities" Mar 13 20:55:49 crc kubenswrapper[4790]: E0313 20:55:49.029838 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b049e8-3316-420e-9ec2-a83f7c645d0d" containerName="registry-server" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.029845 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b049e8-3316-420e-9ec2-a83f7c645d0d" 
containerName="registry-server" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.030055 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="13b049e8-3316-420e-9ec2-a83f7c645d0d" containerName="registry-server" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.031550 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.051384 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vnc4k"] Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.100877 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-catalog-content\") pod \"community-operators-vnc4k\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.100957 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-utilities\") pod \"community-operators-vnc4k\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.100983 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzxcb\" (UniqueName: \"kubernetes.io/projected/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-kube-api-access-mzxcb\") pod \"community-operators-vnc4k\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.203084 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-catalog-content\") pod \"community-operators-vnc4k\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.203184 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-utilities\") pod \"community-operators-vnc4k\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.203217 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzxcb\" (UniqueName: \"kubernetes.io/projected/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-kube-api-access-mzxcb\") pod \"community-operators-vnc4k\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.203792 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-catalog-content\") pod \"community-operators-vnc4k\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.203825 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-utilities\") pod \"community-operators-vnc4k\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.226308 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzxcb\" (UniqueName: 
\"kubernetes.io/projected/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-kube-api-access-mzxcb\") pod \"community-operators-vnc4k\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.347342 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.931439 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vnc4k"] Mar 13 20:55:50 crc kubenswrapper[4790]: I0313 20:55:50.230039 4790 generic.go:334] "Generic (PLEG): container finished" podID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" containerID="a222f6aee417963dd7b4246a1062f198df423f2dc646db76b7b1c2d5761570d9" exitCode=0 Mar 13 20:55:50 crc kubenswrapper[4790]: I0313 20:55:50.230093 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnc4k" event={"ID":"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca","Type":"ContainerDied","Data":"a222f6aee417963dd7b4246a1062f198df423f2dc646db76b7b1c2d5761570d9"} Mar 13 20:55:50 crc kubenswrapper[4790]: I0313 20:55:50.230121 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnc4k" event={"ID":"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca","Type":"ContainerStarted","Data":"3fa0411158a515661fc93f23367f71b3fb55a578ca344043d47fb79a8a6b6cd1"} Mar 13 20:55:51 crc kubenswrapper[4790]: I0313 20:55:51.242411 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnc4k" event={"ID":"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca","Type":"ContainerStarted","Data":"f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d"} Mar 13 20:55:52 crc kubenswrapper[4790]: I0313 20:55:52.253734 4790 generic.go:334] "Generic (PLEG): container finished" podID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" 
containerID="f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d" exitCode=0 Mar 13 20:55:52 crc kubenswrapper[4790]: I0313 20:55:52.253779 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnc4k" event={"ID":"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca","Type":"ContainerDied","Data":"f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d"} Mar 13 20:55:53 crc kubenswrapper[4790]: I0313 20:55:53.264764 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnc4k" event={"ID":"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca","Type":"ContainerStarted","Data":"5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7"} Mar 13 20:55:53 crc kubenswrapper[4790]: I0313 20:55:53.285664 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vnc4k" podStartSLOduration=1.810891243 podStartE2EDuration="4.285647563s" podCreationTimestamp="2026-03-13 20:55:49 +0000 UTC" firstStartedPulling="2026-03-13 20:55:50.232608568 +0000 UTC m=+1681.253724459" lastFinishedPulling="2026-03-13 20:55:52.707364888 +0000 UTC m=+1683.728480779" observedRunningTime="2026-03-13 20:55:53.280990285 +0000 UTC m=+1684.302106176" watchObservedRunningTime="2026-03-13 20:55:53.285647563 +0000 UTC m=+1684.306763454" Mar 13 20:55:55 crc kubenswrapper[4790]: I0313 20:55:55.660300 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:55:55 crc kubenswrapper[4790]: E0313 20:55:55.660871 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" 
podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:55:59 crc kubenswrapper[4790]: I0313 20:55:59.348305 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:59 crc kubenswrapper[4790]: I0313 20:55:59.349963 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:59 crc kubenswrapper[4790]: I0313 20:55:59.400913 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.148694 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557256-v26h5"] Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.149869 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557256-v26h5" Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.159010 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.159038 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.159438 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.162037 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557256-v26h5"] Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.224311 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nrf6\" (UniqueName: \"kubernetes.io/projected/4ffc58ad-c12d-4165-bc92-1e948aa14c42-kube-api-access-5nrf6\") pod 
\"auto-csr-approver-29557256-v26h5\" (UID: \"4ffc58ad-c12d-4165-bc92-1e948aa14c42\") " pod="openshift-infra/auto-csr-approver-29557256-v26h5" Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.327482 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nrf6\" (UniqueName: \"kubernetes.io/projected/4ffc58ad-c12d-4165-bc92-1e948aa14c42-kube-api-access-5nrf6\") pod \"auto-csr-approver-29557256-v26h5\" (UID: \"4ffc58ad-c12d-4165-bc92-1e948aa14c42\") " pod="openshift-infra/auto-csr-approver-29557256-v26h5" Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.348475 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nrf6\" (UniqueName: \"kubernetes.io/projected/4ffc58ad-c12d-4165-bc92-1e948aa14c42-kube-api-access-5nrf6\") pod \"auto-csr-approver-29557256-v26h5\" (UID: \"4ffc58ad-c12d-4165-bc92-1e948aa14c42\") " pod="openshift-infra/auto-csr-approver-29557256-v26h5" Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.389509 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.454757 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vnc4k"] Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.468054 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557256-v26h5" Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.957208 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557256-v26h5"] Mar 13 20:56:00 crc kubenswrapper[4790]: W0313 20:56:00.958948 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ffc58ad_c12d_4165_bc92_1e948aa14c42.slice/crio-11e62a2b63d556f601295cc99baf4b5ec1decd822d940365ab70f3e1697b0ce4 WatchSource:0}: Error finding container 11e62a2b63d556f601295cc99baf4b5ec1decd822d940365ab70f3e1697b0ce4: Status 404 returned error can't find the container with id 11e62a2b63d556f601295cc99baf4b5ec1decd822d940365ab70f3e1697b0ce4 Mar 13 20:56:01 crc kubenswrapper[4790]: I0313 20:56:01.334910 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557256-v26h5" event={"ID":"4ffc58ad-c12d-4165-bc92-1e948aa14c42","Type":"ContainerStarted","Data":"11e62a2b63d556f601295cc99baf4b5ec1decd822d940365ab70f3e1697b0ce4"} Mar 13 20:56:02 crc kubenswrapper[4790]: I0313 20:56:02.345818 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557256-v26h5" event={"ID":"4ffc58ad-c12d-4165-bc92-1e948aa14c42","Type":"ContainerStarted","Data":"089c34632a3aa85bf67d8f16facd625e77441bd26bee098a9592424a45b9e093"} Mar 13 20:56:02 crc kubenswrapper[4790]: I0313 20:56:02.346165 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vnc4k" podUID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" containerName="registry-server" containerID="cri-o://5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7" gracePeriod=2 Mar 13 20:56:02 crc kubenswrapper[4790]: I0313 20:56:02.365327 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29557256-v26h5" podStartSLOduration=1.549286537 podStartE2EDuration="2.365307644s" podCreationTimestamp="2026-03-13 20:56:00 +0000 UTC" firstStartedPulling="2026-03-13 20:56:00.961066156 +0000 UTC m=+1691.982182047" lastFinishedPulling="2026-03-13 20:56:01.777087263 +0000 UTC m=+1692.798203154" observedRunningTime="2026-03-13 20:56:02.360909283 +0000 UTC m=+1693.382025194" watchObservedRunningTime="2026-03-13 20:56:02.365307644 +0000 UTC m=+1693.386423535" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.291330 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.361634 4790 generic.go:334] "Generic (PLEG): container finished" podID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" containerID="5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7" exitCode=0 Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.361690 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.361721 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnc4k" event={"ID":"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca","Type":"ContainerDied","Data":"5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7"} Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.361754 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnc4k" event={"ID":"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca","Type":"ContainerDied","Data":"3fa0411158a515661fc93f23367f71b3fb55a578ca344043d47fb79a8a6b6cd1"} Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.361772 4790 scope.go:117] "RemoveContainer" containerID="5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.363592 4790 generic.go:334] "Generic (PLEG): container finished" podID="4ffc58ad-c12d-4165-bc92-1e948aa14c42" containerID="089c34632a3aa85bf67d8f16facd625e77441bd26bee098a9592424a45b9e093" exitCode=0 Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.363638 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557256-v26h5" event={"ID":"4ffc58ad-c12d-4165-bc92-1e948aa14c42","Type":"ContainerDied","Data":"089c34632a3aa85bf67d8f16facd625e77441bd26bee098a9592424a45b9e093"} Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.387931 4790 scope.go:117] "RemoveContainer" containerID="f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.390444 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-catalog-content\") pod \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\" (UID: 
\"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.390504 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzxcb\" (UniqueName: \"kubernetes.io/projected/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-kube-api-access-mzxcb\") pod \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.390548 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-utilities\") pod \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.391698 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-utilities" (OuterVolumeSpecName: "utilities") pod "8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" (UID: "8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.404918 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-kube-api-access-mzxcb" (OuterVolumeSpecName: "kube-api-access-mzxcb") pod "8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" (UID: "8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca"). InnerVolumeSpecName "kube-api-access-mzxcb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.409208 4790 scope.go:117] "RemoveContainer" containerID="a222f6aee417963dd7b4246a1062f198df423f2dc646db76b7b1c2d5761570d9" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.462295 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" (UID: "8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.492931 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.493190 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzxcb\" (UniqueName: \"kubernetes.io/projected/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-kube-api-access-mzxcb\") on node \"crc\" DevicePath \"\"" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.493278 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.494376 4790 scope.go:117] "RemoveContainer" containerID="5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7" Mar 13 20:56:03 crc kubenswrapper[4790]: E0313 20:56:03.495015 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7\": container with ID starting with 
5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7 not found: ID does not exist" containerID="5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.495061 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7"} err="failed to get container status \"5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7\": rpc error: code = NotFound desc = could not find container \"5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7\": container with ID starting with 5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7 not found: ID does not exist" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.495086 4790 scope.go:117] "RemoveContainer" containerID="f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d" Mar 13 20:56:03 crc kubenswrapper[4790]: E0313 20:56:03.495554 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d\": container with ID starting with f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d not found: ID does not exist" containerID="f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.495618 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d"} err="failed to get container status \"f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d\": rpc error: code = NotFound desc = could not find container \"f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d\": container with ID starting with f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d not found: ID does not 
exist" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.495666 4790 scope.go:117] "RemoveContainer" containerID="a222f6aee417963dd7b4246a1062f198df423f2dc646db76b7b1c2d5761570d9" Mar 13 20:56:03 crc kubenswrapper[4790]: E0313 20:56:03.495968 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a222f6aee417963dd7b4246a1062f198df423f2dc646db76b7b1c2d5761570d9\": container with ID starting with a222f6aee417963dd7b4246a1062f198df423f2dc646db76b7b1c2d5761570d9 not found: ID does not exist" containerID="a222f6aee417963dd7b4246a1062f198df423f2dc646db76b7b1c2d5761570d9" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.496063 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a222f6aee417963dd7b4246a1062f198df423f2dc646db76b7b1c2d5761570d9"} err="failed to get container status \"a222f6aee417963dd7b4246a1062f198df423f2dc646db76b7b1c2d5761570d9\": rpc error: code = NotFound desc = could not find container \"a222f6aee417963dd7b4246a1062f198df423f2dc646db76b7b1c2d5761570d9\": container with ID starting with a222f6aee417963dd7b4246a1062f198df423f2dc646db76b7b1c2d5761570d9 not found: ID does not exist" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.710424 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vnc4k"] Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.718503 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vnc4k"] Mar 13 20:56:04 crc kubenswrapper[4790]: I0313 20:56:04.706552 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557256-v26h5" Mar 13 20:56:04 crc kubenswrapper[4790]: I0313 20:56:04.828204 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nrf6\" (UniqueName: \"kubernetes.io/projected/4ffc58ad-c12d-4165-bc92-1e948aa14c42-kube-api-access-5nrf6\") pod \"4ffc58ad-c12d-4165-bc92-1e948aa14c42\" (UID: \"4ffc58ad-c12d-4165-bc92-1e948aa14c42\") " Mar 13 20:56:04 crc kubenswrapper[4790]: I0313 20:56:04.832339 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ffc58ad-c12d-4165-bc92-1e948aa14c42-kube-api-access-5nrf6" (OuterVolumeSpecName: "kube-api-access-5nrf6") pod "4ffc58ad-c12d-4165-bc92-1e948aa14c42" (UID: "4ffc58ad-c12d-4165-bc92-1e948aa14c42"). InnerVolumeSpecName "kube-api-access-5nrf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:56:04 crc kubenswrapper[4790]: I0313 20:56:04.930415 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nrf6\" (UniqueName: \"kubernetes.io/projected/4ffc58ad-c12d-4165-bc92-1e948aa14c42-kube-api-access-5nrf6\") on node \"crc\" DevicePath \"\"" Mar 13 20:56:05 crc kubenswrapper[4790]: I0313 20:56:05.386432 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557256-v26h5" event={"ID":"4ffc58ad-c12d-4165-bc92-1e948aa14c42","Type":"ContainerDied","Data":"11e62a2b63d556f601295cc99baf4b5ec1decd822d940365ab70f3e1697b0ce4"} Mar 13 20:56:05 crc kubenswrapper[4790]: I0313 20:56:05.386477 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11e62a2b63d556f601295cc99baf4b5ec1decd822d940365ab70f3e1697b0ce4" Mar 13 20:56:05 crc kubenswrapper[4790]: I0313 20:56:05.386688 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557256-v26h5" Mar 13 20:56:05 crc kubenswrapper[4790]: I0313 20:56:05.437436 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557250-wqt56"] Mar 13 20:56:05 crc kubenswrapper[4790]: I0313 20:56:05.446918 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557250-wqt56"] Mar 13 20:56:05 crc kubenswrapper[4790]: I0313 20:56:05.672922 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" path="/var/lib/kubelet/pods/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca/volumes" Mar 13 20:56:05 crc kubenswrapper[4790]: I0313 20:56:05.674082 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d00a5fd8-e634-4969-90ad-6850179e7de1" path="/var/lib/kubelet/pods/d00a5fd8-e634-4969-90ad-6850179e7de1/volumes" Mar 13 20:56:07 crc kubenswrapper[4790]: I0313 20:56:07.407594 4790 generic.go:334] "Generic (PLEG): container finished" podID="5fc3181b-a2df-4d5c-afa1-057cef46dd95" containerID="ef1ad01ff1610150e75c805dfbe677ad94c23d1f578c4b9bb8893fd71bbdb07d" exitCode=0 Mar 13 20:56:07 crc kubenswrapper[4790]: I0313 20:56:07.407681 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" event={"ID":"5fc3181b-a2df-4d5c-afa1-057cef46dd95","Type":"ContainerDied","Data":"ef1ad01ff1610150e75c805dfbe677ad94c23d1f578c4b9bb8893fd71bbdb07d"} Mar 13 20:56:08 crc kubenswrapper[4790]: I0313 20:56:08.660715 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:56:08 crc kubenswrapper[4790]: E0313 20:56:08.661314 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:56:08 crc kubenswrapper[4790]: I0313 20:56:08.811141 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:56:08 crc kubenswrapper[4790]: I0313 20:56:08.913646 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-bootstrap-combined-ca-bundle\") pod \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " Mar 13 20:56:08 crc kubenswrapper[4790]: I0313 20:56:08.913716 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-inventory\") pod \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " Mar 13 20:56:08 crc kubenswrapper[4790]: I0313 20:56:08.913735 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-ssh-key-openstack-edpm-ipam\") pod \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " Mar 13 20:56:08 crc kubenswrapper[4790]: I0313 20:56:08.913835 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctzx4\" (UniqueName: \"kubernetes.io/projected/5fc3181b-a2df-4d5c-afa1-057cef46dd95-kube-api-access-ctzx4\") pod \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " Mar 13 20:56:08 crc kubenswrapper[4790]: I0313 20:56:08.920357 4790 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "5fc3181b-a2df-4d5c-afa1-057cef46dd95" (UID: "5fc3181b-a2df-4d5c-afa1-057cef46dd95"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:56:08 crc kubenswrapper[4790]: I0313 20:56:08.920745 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fc3181b-a2df-4d5c-afa1-057cef46dd95-kube-api-access-ctzx4" (OuterVolumeSpecName: "kube-api-access-ctzx4") pod "5fc3181b-a2df-4d5c-afa1-057cef46dd95" (UID: "5fc3181b-a2df-4d5c-afa1-057cef46dd95"). InnerVolumeSpecName "kube-api-access-ctzx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:56:08 crc kubenswrapper[4790]: I0313 20:56:08.942467 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-inventory" (OuterVolumeSpecName: "inventory") pod "5fc3181b-a2df-4d5c-afa1-057cef46dd95" (UID: "5fc3181b-a2df-4d5c-afa1-057cef46dd95"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:56:08 crc kubenswrapper[4790]: I0313 20:56:08.951896 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5fc3181b-a2df-4d5c-afa1-057cef46dd95" (UID: "5fc3181b-a2df-4d5c-afa1-057cef46dd95"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.016226 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctzx4\" (UniqueName: \"kubernetes.io/projected/5fc3181b-a2df-4d5c-afa1-057cef46dd95-kube-api-access-ctzx4\") on node \"crc\" DevicePath \"\"" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.016261 4790 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.016271 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.016281 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.428246 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" event={"ID":"5fc3181b-a2df-4d5c-afa1-057cef46dd95","Type":"ContainerDied","Data":"8a8a4b31d38642270b5c6ca8e8476670fc95c963faff03b5219523182e59cc45"} Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.428285 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.428292 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a8a4b31d38642270b5c6ca8e8476670fc95c963faff03b5219523182e59cc45" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.501714 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58"] Mar 13 20:56:09 crc kubenswrapper[4790]: E0313 20:56:09.502746 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fc3181b-a2df-4d5c-afa1-057cef46dd95" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.502872 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fc3181b-a2df-4d5c-afa1-057cef46dd95" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 13 20:56:09 crc kubenswrapper[4790]: E0313 20:56:09.502961 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" containerName="extract-utilities" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.503033 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" containerName="extract-utilities" Mar 13 20:56:09 crc kubenswrapper[4790]: E0313 20:56:09.503104 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ffc58ad-c12d-4165-bc92-1e948aa14c42" containerName="oc" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.503184 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ffc58ad-c12d-4165-bc92-1e948aa14c42" containerName="oc" Mar 13 20:56:09 crc kubenswrapper[4790]: E0313 20:56:09.503281 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" containerName="extract-content" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.503365 
4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" containerName="extract-content" Mar 13 20:56:09 crc kubenswrapper[4790]: E0313 20:56:09.503473 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" containerName="registry-server" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.503544 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" containerName="registry-server" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.503846 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ffc58ad-c12d-4165-bc92-1e948aa14c42" containerName="oc" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.503949 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" containerName="registry-server" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.504053 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fc3181b-a2df-4d5c-afa1-057cef46dd95" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.504942 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.506650 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.506844 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.507325 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.507706 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.513768 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58"] Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.629877 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cfp58\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.630123 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cfp58\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 
20:56:09.630175 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57sh2\" (UniqueName: \"kubernetes.io/projected/304addb4-f579-42f8-87d8-8e15b713aef2-kube-api-access-57sh2\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cfp58\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.731703 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cfp58\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.731751 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57sh2\" (UniqueName: \"kubernetes.io/projected/304addb4-f579-42f8-87d8-8e15b713aef2-kube-api-access-57sh2\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cfp58\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.731925 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cfp58\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.745246 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cfp58\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.745349 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cfp58\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.750586 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57sh2\" (UniqueName: \"kubernetes.io/projected/304addb4-f579-42f8-87d8-8e15b713aef2-kube-api-access-57sh2\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cfp58\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.827613 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:56:10 crc kubenswrapper[4790]: I0313 20:56:10.201759 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58"] Mar 13 20:56:10 crc kubenswrapper[4790]: W0313 20:56:10.205226 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod304addb4_f579_42f8_87d8_8e15b713aef2.slice/crio-18ec06c077aaf4342514bd8545ccf6c47aa8e8c7b737cbfc0c2e1411462641bf WatchSource:0}: Error finding container 18ec06c077aaf4342514bd8545ccf6c47aa8e8c7b737cbfc0c2e1411462641bf: Status 404 returned error can't find the container with id 18ec06c077aaf4342514bd8545ccf6c47aa8e8c7b737cbfc0c2e1411462641bf Mar 13 20:56:10 crc kubenswrapper[4790]: I0313 20:56:10.438919 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" event={"ID":"304addb4-f579-42f8-87d8-8e15b713aef2","Type":"ContainerStarted","Data":"18ec06c077aaf4342514bd8545ccf6c47aa8e8c7b737cbfc0c2e1411462641bf"} Mar 13 20:56:11 crc kubenswrapper[4790]: I0313 20:56:11.465791 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" event={"ID":"304addb4-f579-42f8-87d8-8e15b713aef2","Type":"ContainerStarted","Data":"c52e39aa381aa5d2fab1bd41bd98b85078dce93efb6dfc416500534a84998765"} Mar 13 20:56:11 crc kubenswrapper[4790]: I0313 20:56:11.486070 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" podStartSLOduration=2.001567418 podStartE2EDuration="2.486053856s" podCreationTimestamp="2026-03-13 20:56:09 +0000 UTC" firstStartedPulling="2026-03-13 20:56:10.207686019 +0000 UTC m=+1701.228801910" lastFinishedPulling="2026-03-13 20:56:10.692172447 +0000 UTC 
m=+1701.713288348" observedRunningTime="2026-03-13 20:56:11.484284897 +0000 UTC m=+1702.505400788" watchObservedRunningTime="2026-03-13 20:56:11.486053856 +0000 UTC m=+1702.507169747" Mar 13 20:56:21 crc kubenswrapper[4790]: I0313 20:56:21.659864 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:56:21 crc kubenswrapper[4790]: E0313 20:56:21.660576 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:56:32 crc kubenswrapper[4790]: I0313 20:56:32.661114 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:56:32 crc kubenswrapper[4790]: E0313 20:56:32.661971 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:56:33 crc kubenswrapper[4790]: I0313 20:56:33.037551 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-qflsz"] Mar 13 20:56:33 crc kubenswrapper[4790]: I0313 20:56:33.065964 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-bc9a-account-create-update-7s4hb"] Mar 13 20:56:33 crc kubenswrapper[4790]: I0313 20:56:33.074497 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-bc9a-account-create-update-7s4hb"] Mar 13 20:56:33 crc kubenswrapper[4790]: I0313 20:56:33.082740 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-qflsz"] Mar 13 20:56:33 crc kubenswrapper[4790]: I0313 20:56:33.668729 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b5f7e2a-401c-4a9f-9222-5037f9d1d499" path="/var/lib/kubelet/pods/1b5f7e2a-401c-4a9f-9222-5037f9d1d499/volumes" Mar 13 20:56:33 crc kubenswrapper[4790]: I0313 20:56:33.669269 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bfc00cf-9a76-4b6f-a8f5-315af824814d" path="/var/lib/kubelet/pods/9bfc00cf-9a76-4b6f-a8f5-315af824814d/volumes" Mar 13 20:56:34 crc kubenswrapper[4790]: I0313 20:56:34.029762 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-swgpr"] Mar 13 20:56:34 crc kubenswrapper[4790]: I0313 20:56:34.039303 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-76eb-account-create-update-fsrb9"] Mar 13 20:56:34 crc kubenswrapper[4790]: I0313 20:56:34.050479 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-dtps4"] Mar 13 20:56:34 crc kubenswrapper[4790]: I0313 20:56:34.059118 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6245-account-create-update-5tjxd"] Mar 13 20:56:34 crc kubenswrapper[4790]: I0313 20:56:34.068075 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6245-account-create-update-5tjxd"] Mar 13 20:56:34 crc kubenswrapper[4790]: I0313 20:56:34.076686 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-swgpr"] Mar 13 20:56:34 crc kubenswrapper[4790]: I0313 20:56:34.085763 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-dtps4"] Mar 13 20:56:34 crc kubenswrapper[4790]: I0313 20:56:34.095052 4790 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/placement-76eb-account-create-update-fsrb9"] Mar 13 20:56:35 crc kubenswrapper[4790]: I0313 20:56:35.669127 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c" path="/var/lib/kubelet/pods/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c/volumes" Mar 13 20:56:35 crc kubenswrapper[4790]: I0313 20:56:35.669674 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20a3a1cb-c500-4355-ae67-649e381b1b88" path="/var/lib/kubelet/pods/20a3a1cb-c500-4355-ae67-649e381b1b88/volumes" Mar 13 20:56:35 crc kubenswrapper[4790]: I0313 20:56:35.670287 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a7e4224-0922-4f9a-af94-0a9933f27530" path="/var/lib/kubelet/pods/2a7e4224-0922-4f9a-af94-0a9933f27530/volumes" Mar 13 20:56:35 crc kubenswrapper[4790]: I0313 20:56:35.670918 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0" path="/var/lib/kubelet/pods/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0/volumes" Mar 13 20:56:43 crc kubenswrapper[4790]: I0313 20:56:43.660233 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:56:43 crc kubenswrapper[4790]: E0313 20:56:43.661082 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:56:49 crc kubenswrapper[4790]: I0313 20:56:49.042610 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-fjjbp"] Mar 13 20:56:49 crc kubenswrapper[4790]: I0313 20:56:49.052717 4790 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-fjjbp"] Mar 13 20:56:49 crc kubenswrapper[4790]: I0313 20:56:49.675535 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfa975ed-d42b-43be-91a1-4a2288005883" path="/var/lib/kubelet/pods/cfa975ed-d42b-43be-91a1-4a2288005883/volumes" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.461092 4790 scope.go:117] "RemoveContainer" containerID="5bf52c9a0edc80ae6550c060c79e70ad8f311cf1880d5319a92eed662b3ae498" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.482956 4790 scope.go:117] "RemoveContainer" containerID="684aff5511e6e0a081533906daec355673be31064917c7fdefb18571783852b8" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.522558 4790 scope.go:117] "RemoveContainer" containerID="caad9d6f6144a7c4b4a17b2bfe51bfa98c2031dcffb22ecbae67c200ab59beba" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.546040 4790 scope.go:117] "RemoveContainer" containerID="e50d6c82675c18c36b9041dc6a13dffb21bb7a9c1cb73ee61c06ce0d61f0b9b3" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.587813 4790 scope.go:117] "RemoveContainer" containerID="eaedee9332ceb5ac2c43fa820fcea3e6086d5dfda3317381786c3cc819576b44" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.639807 4790 scope.go:117] "RemoveContainer" containerID="2d9bc31a36f8979f03c449ef60b47d579e8e6f07093cd0f2e81bc56503b15368" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.691188 4790 scope.go:117] "RemoveContainer" containerID="3a871452c0d8f0bdf8b93e4dc697c5c69984d2e498b186e2955afdc399d1238f" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.716629 4790 scope.go:117] "RemoveContainer" containerID="a46a82afe76ba100b2ac912d7fb0a03ce75de0a957f3543d9259571fea13e90c" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.755719 4790 scope.go:117] "RemoveContainer" containerID="4ebc3b3d24c09595199a728e2bcc2be34cd5ab68545cd7072d9ba0e08a6b3dd5" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 
20:56:53.780426 4790 scope.go:117] "RemoveContainer" containerID="32a172eb03df58396e7932c7253bd1c867efa547f28c6e6ae81472b2dd89ad69" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.799623 4790 scope.go:117] "RemoveContainer" containerID="7987e2881155643a03157d6664eb40102409516e2d6983b6bde0009190d2b009" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.821407 4790 scope.go:117] "RemoveContainer" containerID="ff9f56b80e2e388086557f7fc707002adf2609bc96cff97367abf262894bf61f" Mar 13 20:56:58 crc kubenswrapper[4790]: I0313 20:56:58.660242 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:56:58 crc kubenswrapper[4790]: E0313 20:56:58.660974 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:57:01 crc kubenswrapper[4790]: I0313 20:57:01.032569 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-pshzp"] Mar 13 20:57:01 crc kubenswrapper[4790]: I0313 20:57:01.042013 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-pshzp"] Mar 13 20:57:01 crc kubenswrapper[4790]: I0313 20:57:01.672179 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a93720f0-c882-49d8-bd56-7d77237da6e7" path="/var/lib/kubelet/pods/a93720f0-c882-49d8-bd56-7d77237da6e7/volumes" Mar 13 20:57:11 crc kubenswrapper[4790]: I0313 20:57:11.660319 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:57:11 crc kubenswrapper[4790]: E0313 20:57:11.661159 4790 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:57:15 crc kubenswrapper[4790]: I0313 20:57:15.050490 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-3bc0-account-create-update-ntn27"] Mar 13 20:57:15 crc kubenswrapper[4790]: I0313 20:57:15.071303 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-3bc0-account-create-update-ntn27"] Mar 13 20:57:15 crc kubenswrapper[4790]: I0313 20:57:15.671932 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfc09f48-1b0c-45fe-be9b-8bf3a3af887c" path="/var/lib/kubelet/pods/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c/volumes" Mar 13 20:57:16 crc kubenswrapper[4790]: I0313 20:57:16.028668 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-f926w"] Mar 13 20:57:16 crc kubenswrapper[4790]: I0313 20:57:16.042645 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-f926w"] Mar 13 20:57:16 crc kubenswrapper[4790]: I0313 20:57:16.050167 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-4d80-account-create-update-7trkt"] Mar 13 20:57:16 crc kubenswrapper[4790]: I0313 20:57:16.060785 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-4p54c"] Mar 13 20:57:16 crc kubenswrapper[4790]: I0313 20:57:16.080552 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-56s96"] Mar 13 20:57:16 crc kubenswrapper[4790]: I0313 20:57:16.080619 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-eae0-account-create-update-ljhjl"] Mar 13 
20:57:16 crc kubenswrapper[4790]: I0313 20:57:16.090513 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-4p54c"] Mar 13 20:57:16 crc kubenswrapper[4790]: I0313 20:57:16.100373 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-4d80-account-create-update-7trkt"] Mar 13 20:57:16 crc kubenswrapper[4790]: I0313 20:57:16.108251 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-eae0-account-create-update-ljhjl"] Mar 13 20:57:16 crc kubenswrapper[4790]: I0313 20:57:16.115205 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-56s96"] Mar 13 20:57:17 crc kubenswrapper[4790]: I0313 20:57:17.670299 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dd76b06-ea34-4044-bba0-cf5e6e822b6b" path="/var/lib/kubelet/pods/1dd76b06-ea34-4044-bba0-cf5e6e822b6b/volumes" Mar 13 20:57:17 crc kubenswrapper[4790]: I0313 20:57:17.671124 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e11abfd-7d59-479b-9f77-cbbd22cbf48c" path="/var/lib/kubelet/pods/8e11abfd-7d59-479b-9f77-cbbd22cbf48c/volumes" Mar 13 20:57:17 crc kubenswrapper[4790]: I0313 20:57:17.671647 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e551be1a-728e-4851-894c-30b4493326d6" path="/var/lib/kubelet/pods/e551be1a-728e-4851-894c-30b4493326d6/volumes" Mar 13 20:57:17 crc kubenswrapper[4790]: I0313 20:57:17.672180 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7d496eb-3f17-4e7b-9a68-c91dec27355a" path="/var/lib/kubelet/pods/e7d496eb-3f17-4e7b-9a68-c91dec27355a/volumes" Mar 13 20:57:17 crc kubenswrapper[4790]: I0313 20:57:17.673165 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc51014c-323e-4a6b-9202-edc7b135809d" path="/var/lib/kubelet/pods/fc51014c-323e-4a6b-9202-edc7b135809d/volumes" Mar 13 20:57:20 crc kubenswrapper[4790]: I0313 20:57:20.030248 4790 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-jf9fb"] Mar 13 20:57:20 crc kubenswrapper[4790]: I0313 20:57:20.040871 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-jf9fb"] Mar 13 20:57:21 crc kubenswrapper[4790]: I0313 20:57:21.670812 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4214f238-4044-45ab-8e40-48894500f25f" path="/var/lib/kubelet/pods/4214f238-4044-45ab-8e40-48894500f25f/volumes" Mar 13 20:57:26 crc kubenswrapper[4790]: I0313 20:57:26.660173 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:57:26 crc kubenswrapper[4790]: E0313 20:57:26.661033 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:57:37 crc kubenswrapper[4790]: I0313 20:57:37.661039 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:57:37 crc kubenswrapper[4790]: E0313 20:57:37.661911 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:57:48 crc kubenswrapper[4790]: I0313 20:57:48.660044 4790 scope.go:117] "RemoveContainer" 
containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:57:48 crc kubenswrapper[4790]: E0313 20:57:48.660795 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:57:49 crc kubenswrapper[4790]: I0313 20:57:49.312612 4790 generic.go:334] "Generic (PLEG): container finished" podID="304addb4-f579-42f8-87d8-8e15b713aef2" containerID="c52e39aa381aa5d2fab1bd41bd98b85078dce93efb6dfc416500534a84998765" exitCode=0 Mar 13 20:57:49 crc kubenswrapper[4790]: I0313 20:57:49.312670 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" event={"ID":"304addb4-f579-42f8-87d8-8e15b713aef2","Type":"ContainerDied","Data":"c52e39aa381aa5d2fab1bd41bd98b85078dce93efb6dfc416500534a84998765"} Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.053307 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-mg4xg"] Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.060737 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-mg4xg"] Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.715277 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.827406 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57sh2\" (UniqueName: \"kubernetes.io/projected/304addb4-f579-42f8-87d8-8e15b713aef2-kube-api-access-57sh2\") pod \"304addb4-f579-42f8-87d8-8e15b713aef2\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.827491 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-ssh-key-openstack-edpm-ipam\") pod \"304addb4-f579-42f8-87d8-8e15b713aef2\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.827576 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-inventory\") pod \"304addb4-f579-42f8-87d8-8e15b713aef2\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.833551 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/304addb4-f579-42f8-87d8-8e15b713aef2-kube-api-access-57sh2" (OuterVolumeSpecName: "kube-api-access-57sh2") pod "304addb4-f579-42f8-87d8-8e15b713aef2" (UID: "304addb4-f579-42f8-87d8-8e15b713aef2"). InnerVolumeSpecName "kube-api-access-57sh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.858860 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-inventory" (OuterVolumeSpecName: "inventory") pod "304addb4-f579-42f8-87d8-8e15b713aef2" (UID: "304addb4-f579-42f8-87d8-8e15b713aef2"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.859664 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "304addb4-f579-42f8-87d8-8e15b713aef2" (UID: "304addb4-f579-42f8-87d8-8e15b713aef2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.931673 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.931949 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57sh2\" (UniqueName: \"kubernetes.io/projected/304addb4-f579-42f8-87d8-8e15b713aef2-kube-api-access-57sh2\") on node \"crc\" DevicePath \"\"" Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.932031 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.330509 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" event={"ID":"304addb4-f579-42f8-87d8-8e15b713aef2","Type":"ContainerDied","Data":"18ec06c077aaf4342514bd8545ccf6c47aa8e8c7b737cbfc0c2e1411462641bf"} Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.330800 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18ec06c077aaf4342514bd8545ccf6c47aa8e8c7b737cbfc0c2e1411462641bf" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 
20:57:51.330619 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.460346 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564"] Mar 13 20:57:51 crc kubenswrapper[4790]: E0313 20:57:51.461482 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="304addb4-f579-42f8-87d8-8e15b713aef2" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.461507 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="304addb4-f579-42f8-87d8-8e15b713aef2" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.462124 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="304addb4-f579-42f8-87d8-8e15b713aef2" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.463532 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.497853 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.498131 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.498306 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.498703 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.506235 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564"] Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.552176 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vg564\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.552256 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vg564\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:57:51 crc 
kubenswrapper[4790]: I0313 20:57:51.552660 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s5f7\" (UniqueName: \"kubernetes.io/projected/c1609d29-96e5-43eb-a086-5587ca7c4f5a-kube-api-access-9s5f7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vg564\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.654574 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vg564\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.655281 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vg564\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.655439 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s5f7\" (UniqueName: \"kubernetes.io/projected/c1609d29-96e5-43eb-a086-5587ca7c4f5a-kube-api-access-9s5f7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vg564\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.665313 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vg564\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.671857 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vg564\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.675715 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s5f7\" (UniqueName: \"kubernetes.io/projected/c1609d29-96e5-43eb-a086-5587ca7c4f5a-kube-api-access-9s5f7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vg564\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.675795 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eef97bfb-4275-4a0a-bae4-5442cf7400dd" path="/var/lib/kubelet/pods/eef97bfb-4275-4a0a-bae4-5442cf7400dd/volumes" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.840279 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:57:52 crc kubenswrapper[4790]: I0313 20:57:52.400992 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564"] Mar 13 20:57:53 crc kubenswrapper[4790]: I0313 20:57:53.359001 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" event={"ID":"c1609d29-96e5-43eb-a086-5587ca7c4f5a","Type":"ContainerStarted","Data":"0f2da745b394b5be4861e2d2e60fb64fdc25fcc05f8c0e3c406f5c5afdec6971"} Mar 13 20:57:53 crc kubenswrapper[4790]: I0313 20:57:53.359335 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" event={"ID":"c1609d29-96e5-43eb-a086-5587ca7c4f5a","Type":"ContainerStarted","Data":"33fb4801a3e818d29df5755724333f7626e2f157952f8e54477ff7fc99bb6957"} Mar 13 20:57:53 crc kubenswrapper[4790]: I0313 20:57:53.385736 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" podStartSLOduration=1.908970958 podStartE2EDuration="2.385713512s" podCreationTimestamp="2026-03-13 20:57:51 +0000 UTC" firstStartedPulling="2026-03-13 20:57:52.407564734 +0000 UTC m=+1803.428680625" lastFinishedPulling="2026-03-13 20:57:52.884307288 +0000 UTC m=+1803.905423179" observedRunningTime="2026-03-13 20:57:53.378648368 +0000 UTC m=+1804.399764259" watchObservedRunningTime="2026-03-13 20:57:53.385713512 +0000 UTC m=+1804.406829423" Mar 13 20:57:53 crc kubenswrapper[4790]: I0313 20:57:53.997303 4790 scope.go:117] "RemoveContainer" containerID="7f1ca4be311e4bf8899acd7ffc7b40f8dd562b652669b076fe646ca2df5ae15e" Mar 13 20:57:54 crc kubenswrapper[4790]: I0313 20:57:54.023572 4790 scope.go:117] "RemoveContainer" 
containerID="5b9b7cadced0d29da460e85098fd79f31bf772b7450962d6c1f3bf09b62a0134" Mar 13 20:57:54 crc kubenswrapper[4790]: I0313 20:57:54.077515 4790 scope.go:117] "RemoveContainer" containerID="37311d8f14a45460392cc2657752fc09be6fc325071ebe0626eb04d799e80545" Mar 13 20:57:54 crc kubenswrapper[4790]: I0313 20:57:54.111398 4790 scope.go:117] "RemoveContainer" containerID="0f4d13a4ad3c2ce36bd8fc01aafd587a060f2b33fce34cbf54f0cbd83e9fb1ca" Mar 13 20:57:54 crc kubenswrapper[4790]: I0313 20:57:54.488225 4790 scope.go:117] "RemoveContainer" containerID="4f445f85254948b2a82910d93997f50d41021103d40e52ebd6447aec6a71de39" Mar 13 20:57:54 crc kubenswrapper[4790]: I0313 20:57:54.510630 4790 scope.go:117] "RemoveContainer" containerID="047c96b0959e792e896cbcb062d30482e777ac7ce2334a4427efe91c5a39d9a3" Mar 13 20:57:54 crc kubenswrapper[4790]: I0313 20:57:54.563856 4790 scope.go:117] "RemoveContainer" containerID="7db39c36784dd09efea0e74c586352f81de4ffb0a5c4d04fdfe061e937df855c" Mar 13 20:57:54 crc kubenswrapper[4790]: I0313 20:57:54.596856 4790 scope.go:117] "RemoveContainer" containerID="fb829732267d5d36436612626f2036bb0698b4bd86f5c88383f3ee7aba396142" Mar 13 20:57:54 crc kubenswrapper[4790]: I0313 20:57:54.618158 4790 scope.go:117] "RemoveContainer" containerID="36f3978e6e158babd7d1c6c18b804e801c1d5a860c6298e1e465b9030818d00c" Mar 13 20:57:58 crc kubenswrapper[4790]: I0313 20:57:58.032810 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-wbb8v"] Mar 13 20:57:58 crc kubenswrapper[4790]: I0313 20:57:58.041943 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-wbb8v"] Mar 13 20:57:59 crc kubenswrapper[4790]: I0313 20:57:59.667888 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:57:59 crc kubenswrapper[4790]: E0313 20:57:59.668365 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:57:59 crc kubenswrapper[4790]: I0313 20:57:59.683007 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8b8bbca-4be9-43d3-b692-0587892a50b4" path="/var/lib/kubelet/pods/e8b8bbca-4be9-43d3-b692-0587892a50b4/volumes" Mar 13 20:58:00 crc kubenswrapper[4790]: I0313 20:58:00.135857 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557258-gqmrr"] Mar 13 20:58:00 crc kubenswrapper[4790]: I0313 20:58:00.138036 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557258-gqmrr" Mar 13 20:58:00 crc kubenswrapper[4790]: I0313 20:58:00.141056 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:58:00 crc kubenswrapper[4790]: I0313 20:58:00.141113 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:58:00 crc kubenswrapper[4790]: I0313 20:58:00.141064 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:58:00 crc kubenswrapper[4790]: I0313 20:58:00.143361 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557258-gqmrr"] Mar 13 20:58:00 crc kubenswrapper[4790]: I0313 20:58:00.184616 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6st5\" (UniqueName: \"kubernetes.io/projected/7706813b-e8e7-4b17-ba18-993c121eed66-kube-api-access-g6st5\") pod \"auto-csr-approver-29557258-gqmrr\" (UID: 
\"7706813b-e8e7-4b17-ba18-993c121eed66\") " pod="openshift-infra/auto-csr-approver-29557258-gqmrr" Mar 13 20:58:00 crc kubenswrapper[4790]: I0313 20:58:00.285779 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6st5\" (UniqueName: \"kubernetes.io/projected/7706813b-e8e7-4b17-ba18-993c121eed66-kube-api-access-g6st5\") pod \"auto-csr-approver-29557258-gqmrr\" (UID: \"7706813b-e8e7-4b17-ba18-993c121eed66\") " pod="openshift-infra/auto-csr-approver-29557258-gqmrr" Mar 13 20:58:00 crc kubenswrapper[4790]: I0313 20:58:00.305184 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6st5\" (UniqueName: \"kubernetes.io/projected/7706813b-e8e7-4b17-ba18-993c121eed66-kube-api-access-g6st5\") pod \"auto-csr-approver-29557258-gqmrr\" (UID: \"7706813b-e8e7-4b17-ba18-993c121eed66\") " pod="openshift-infra/auto-csr-approver-29557258-gqmrr" Mar 13 20:58:00 crc kubenswrapper[4790]: I0313 20:58:00.494897 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557258-gqmrr" Mar 13 20:58:00 crc kubenswrapper[4790]: I0313 20:58:00.942067 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557258-gqmrr"] Mar 13 20:58:01 crc kubenswrapper[4790]: I0313 20:58:01.564593 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557258-gqmrr" event={"ID":"7706813b-e8e7-4b17-ba18-993c121eed66","Type":"ContainerStarted","Data":"27f1bb9cc432cac3351638db6116ab3783e0e90cad2ade62e350ed91b52d3746"} Mar 13 20:58:02 crc kubenswrapper[4790]: I0313 20:58:02.587733 4790 generic.go:334] "Generic (PLEG): container finished" podID="7706813b-e8e7-4b17-ba18-993c121eed66" containerID="98d6a341587e40eeb366a4b8a2eab51c3ea58fa67b5db767f9e2261febd34d64" exitCode=0 Mar 13 20:58:02 crc kubenswrapper[4790]: I0313 20:58:02.588085 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557258-gqmrr" event={"ID":"7706813b-e8e7-4b17-ba18-993c121eed66","Type":"ContainerDied","Data":"98d6a341587e40eeb366a4b8a2eab51c3ea58fa67b5db767f9e2261febd34d64"} Mar 13 20:58:03 crc kubenswrapper[4790]: I0313 20:58:03.909914 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557258-gqmrr" Mar 13 20:58:04 crc kubenswrapper[4790]: I0313 20:58:04.035494 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-m4zxn"] Mar 13 20:58:04 crc kubenswrapper[4790]: I0313 20:58:04.045066 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-m4zxn"] Mar 13 20:58:04 crc kubenswrapper[4790]: I0313 20:58:04.057540 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6st5\" (UniqueName: \"kubernetes.io/projected/7706813b-e8e7-4b17-ba18-993c121eed66-kube-api-access-g6st5\") pod \"7706813b-e8e7-4b17-ba18-993c121eed66\" (UID: \"7706813b-e8e7-4b17-ba18-993c121eed66\") " Mar 13 20:58:04 crc kubenswrapper[4790]: I0313 20:58:04.064682 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7706813b-e8e7-4b17-ba18-993c121eed66-kube-api-access-g6st5" (OuterVolumeSpecName: "kube-api-access-g6st5") pod "7706813b-e8e7-4b17-ba18-993c121eed66" (UID: "7706813b-e8e7-4b17-ba18-993c121eed66"). InnerVolumeSpecName "kube-api-access-g6st5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:58:04 crc kubenswrapper[4790]: I0313 20:58:04.159439 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6st5\" (UniqueName: \"kubernetes.io/projected/7706813b-e8e7-4b17-ba18-993c121eed66-kube-api-access-g6st5\") on node \"crc\" DevicePath \"\"" Mar 13 20:58:04 crc kubenswrapper[4790]: I0313 20:58:04.608327 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557258-gqmrr" event={"ID":"7706813b-e8e7-4b17-ba18-993c121eed66","Type":"ContainerDied","Data":"27f1bb9cc432cac3351638db6116ab3783e0e90cad2ade62e350ed91b52d3746"} Mar 13 20:58:04 crc kubenswrapper[4790]: I0313 20:58:04.608365 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27f1bb9cc432cac3351638db6116ab3783e0e90cad2ade62e350ed91b52d3746" Mar 13 20:58:04 crc kubenswrapper[4790]: I0313 20:58:04.608450 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557258-gqmrr" Mar 13 20:58:04 crc kubenswrapper[4790]: I0313 20:58:04.964530 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557252-mfnmk"] Mar 13 20:58:04 crc kubenswrapper[4790]: I0313 20:58:04.971979 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557252-mfnmk"] Mar 13 20:58:05 crc kubenswrapper[4790]: I0313 20:58:05.669528 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b77751d8-7e07-4d67-9bed-3858cbfc5c3f" path="/var/lib/kubelet/pods/b77751d8-7e07-4d67-9bed-3858cbfc5c3f/volumes" Mar 13 20:58:05 crc kubenswrapper[4790]: I0313 20:58:05.670426 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd2c3694-0492-400f-98bd-b3c641edfac0" path="/var/lib/kubelet/pods/dd2c3694-0492-400f-98bd-b3c641edfac0/volumes" Mar 13 20:58:09 crc kubenswrapper[4790]: I0313 20:58:09.038582 4790 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-kkmzk"] Mar 13 20:58:09 crc kubenswrapper[4790]: I0313 20:58:09.049157 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-kkmzk"] Mar 13 20:58:09 crc kubenswrapper[4790]: I0313 20:58:09.671288 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dff6930-5d07-4df7-8d42-470ae83afd38" path="/var/lib/kubelet/pods/5dff6930-5d07-4df7-8d42-470ae83afd38/volumes" Mar 13 20:58:11 crc kubenswrapper[4790]: I0313 20:58:11.660344 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:58:11 crc kubenswrapper[4790]: E0313 20:58:11.661184 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:58:16 crc kubenswrapper[4790]: I0313 20:58:16.055835 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-g2nmn"] Mar 13 20:58:16 crc kubenswrapper[4790]: I0313 20:58:16.063853 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-g2nmn"] Mar 13 20:58:17 crc kubenswrapper[4790]: I0313 20:58:17.671751 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32ffb609-7a3b-42b7-b513-7003deefe5dd" path="/var/lib/kubelet/pods/32ffb609-7a3b-42b7-b513-7003deefe5dd/volumes" Mar 13 20:58:26 crc kubenswrapper[4790]: I0313 20:58:26.660519 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:58:26 crc kubenswrapper[4790]: E0313 20:58:26.661328 4790 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:58:40 crc kubenswrapper[4790]: I0313 20:58:40.659495 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:58:40 crc kubenswrapper[4790]: E0313 20:58:40.660169 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:58:51 crc kubenswrapper[4790]: I0313 20:58:51.660197 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:58:51 crc kubenswrapper[4790]: E0313 20:58:51.661913 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:58:54 crc kubenswrapper[4790]: I0313 20:58:54.770907 4790 scope.go:117] "RemoveContainer" containerID="b5ea61f802c1b094e15351a6cc95042eca8f16ab2272c8f7af336afbb299a8d5" Mar 13 20:58:55 crc kubenswrapper[4790]: I0313 20:58:55.245407 4790 
scope.go:117] "RemoveContainer" containerID="51c35566a48d60d5e5b84368517b8e770f4896138c85e1636c2114cd13bfa196" Mar 13 20:58:55 crc kubenswrapper[4790]: I0313 20:58:55.281682 4790 scope.go:117] "RemoveContainer" containerID="de4f3208380e46019eb11e33bfcd9916170845c8672c15c2d9cbbb7f438283bb" Mar 13 20:58:55 crc kubenswrapper[4790]: I0313 20:58:55.347335 4790 scope.go:117] "RemoveContainer" containerID="062bb846937d0ad9d07de45246277a5920215b483e1948fdfbd9ea7168c9a51a" Mar 13 20:58:55 crc kubenswrapper[4790]: I0313 20:58:55.378773 4790 scope.go:117] "RemoveContainer" containerID="f2216663957b1ff7be0364b827b231924669a938cca6695aaf9da572dc71b0b9" Mar 13 20:59:01 crc kubenswrapper[4790]: I0313 20:59:01.085793 4790 generic.go:334] "Generic (PLEG): container finished" podID="c1609d29-96e5-43eb-a086-5587ca7c4f5a" containerID="0f2da745b394b5be4861e2d2e60fb64fdc25fcc05f8c0e3c406f5c5afdec6971" exitCode=0 Mar 13 20:59:01 crc kubenswrapper[4790]: I0313 20:59:01.086314 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" event={"ID":"c1609d29-96e5-43eb-a086-5587ca7c4f5a","Type":"ContainerDied","Data":"0f2da745b394b5be4861e2d2e60fb64fdc25fcc05f8c0e3c406f5c5afdec6971"} Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.054274 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-lrnph"] Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.063639 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-jjv8c"] Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.071901 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-fe05-account-create-update-dwwd8"] Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.079525 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-lrnph"] Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.087294 4790 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-jjv8c"] Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.095369 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-fe05-account-create-update-dwwd8"] Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.491244 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.568206 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-ssh-key-openstack-edpm-ipam\") pod \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.568273 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s5f7\" (UniqueName: \"kubernetes.io/projected/c1609d29-96e5-43eb-a086-5587ca7c4f5a-kube-api-access-9s5f7\") pod \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.568319 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-inventory\") pod \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.573828 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1609d29-96e5-43eb-a086-5587ca7c4f5a-kube-api-access-9s5f7" (OuterVolumeSpecName: "kube-api-access-9s5f7") pod "c1609d29-96e5-43eb-a086-5587ca7c4f5a" (UID: "c1609d29-96e5-43eb-a086-5587ca7c4f5a"). InnerVolumeSpecName "kube-api-access-9s5f7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.593784 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-inventory" (OuterVolumeSpecName: "inventory") pod "c1609d29-96e5-43eb-a086-5587ca7c4f5a" (UID: "c1609d29-96e5-43eb-a086-5587ca7c4f5a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.604800 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c1609d29-96e5-43eb-a086-5587ca7c4f5a" (UID: "c1609d29-96e5-43eb-a086-5587ca7c4f5a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.670572 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.670621 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s5f7\" (UniqueName: \"kubernetes.io/projected/c1609d29-96e5-43eb-a086-5587ca7c4f5a-kube-api-access-9s5f7\") on node \"crc\" DevicePath \"\"" Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.670630 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.035782 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-kq55v"] Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 
20:59:03.047756 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-926f-account-create-update-nnl2f"] Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.056156 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-8821-account-create-update-l6ffx"] Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.064361 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-kq55v"] Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.072784 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-8821-account-create-update-l6ffx"] Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.083282 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-926f-account-create-update-nnl2f"] Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.103465 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" event={"ID":"c1609d29-96e5-43eb-a086-5587ca7c4f5a","Type":"ContainerDied","Data":"33fb4801a3e818d29df5755724333f7626e2f157952f8e54477ff7fc99bb6957"} Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.103511 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33fb4801a3e818d29df5755724333f7626e2f157952f8e54477ff7fc99bb6957" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.103533 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.201890 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff"] Mar 13 20:59:03 crc kubenswrapper[4790]: E0313 20:59:03.202245 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7706813b-e8e7-4b17-ba18-993c121eed66" containerName="oc" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.202264 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7706813b-e8e7-4b17-ba18-993c121eed66" containerName="oc" Mar 13 20:59:03 crc kubenswrapper[4790]: E0313 20:59:03.202289 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1609d29-96e5-43eb-a086-5587ca7c4f5a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.202297 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1609d29-96e5-43eb-a086-5587ca7c4f5a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.202503 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1609d29-96e5-43eb-a086-5587ca7c4f5a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.202534 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7706813b-e8e7-4b17-ba18-993c121eed66" containerName="oc" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.203141 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.210337 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.210542 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.210832 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.211108 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.217652 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff"] Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.279193 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7phtc\" (UniqueName: \"kubernetes.io/projected/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-kube-api-access-7phtc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-n54ff\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.279284 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-n54ff\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 
20:59:03.279314 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-n54ff\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:03 crc kubenswrapper[4790]: E0313 20:59:03.289092 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1609d29_96e5_43eb_a086_5587ca7c4f5a.slice\": RecentStats: unable to find data in memory cache]" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.381473 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7phtc\" (UniqueName: \"kubernetes.io/projected/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-kube-api-access-7phtc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-n54ff\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.381585 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-n54ff\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.381626 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-ssh-key-openstack-edpm-ipam\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-n54ff\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.387133 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-n54ff\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.398149 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-n54ff\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.402891 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7phtc\" (UniqueName: \"kubernetes.io/projected/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-kube-api-access-7phtc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-n54ff\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.529127 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.671417 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00f4f78b-ccfb-4413-9a81-d5b461a5e319" path="/var/lib/kubelet/pods/00f4f78b-ccfb-4413-9a81-d5b461a5e319/volumes" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.672225 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a4ef124-b4dd-43df-bdfb-97c65685977c" path="/var/lib/kubelet/pods/1a4ef124-b4dd-43df-bdfb-97c65685977c/volumes" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.672842 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="536b2b85-21d0-47ba-8825-998dcb7b0058" path="/var/lib/kubelet/pods/536b2b85-21d0-47ba-8825-998dcb7b0058/volumes" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.673388 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86c0a379-8f0b-4414-863c-eaed0745ce2d" path="/var/lib/kubelet/pods/86c0a379-8f0b-4414-863c-eaed0745ce2d/volumes" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.674509 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c861107-6a1d-49f7-bc63-b95008ee5ddc" path="/var/lib/kubelet/pods/9c861107-6a1d-49f7-bc63-b95008ee5ddc/volumes" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.675064 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcc0f61e-f0ce-4443-9eec-0488ff92b388" path="/var/lib/kubelet/pods/dcc0f61e-f0ce-4443-9eec-0488ff92b388/volumes" Mar 13 20:59:04 crc kubenswrapper[4790]: W0313 20:59:04.044575 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20beb5d9_49e6_47c7_a3ad_107ff79e56fd.slice/crio-f71719d135da71fc8f3439eeddb8cb7ddd2e732ead2768f62f79060482322eb0 WatchSource:0}: Error finding container 
f71719d135da71fc8f3439eeddb8cb7ddd2e732ead2768f62f79060482322eb0: Status 404 returned error can't find the container with id f71719d135da71fc8f3439eeddb8cb7ddd2e732ead2768f62f79060482322eb0 Mar 13 20:59:04 crc kubenswrapper[4790]: I0313 20:59:04.045350 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff"] Mar 13 20:59:04 crc kubenswrapper[4790]: I0313 20:59:04.048525 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 20:59:04 crc kubenswrapper[4790]: I0313 20:59:04.111665 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" event={"ID":"20beb5d9-49e6-47c7-a3ad-107ff79e56fd","Type":"ContainerStarted","Data":"f71719d135da71fc8f3439eeddb8cb7ddd2e732ead2768f62f79060482322eb0"} Mar 13 20:59:05 crc kubenswrapper[4790]: I0313 20:59:05.121737 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" event={"ID":"20beb5d9-49e6-47c7-a3ad-107ff79e56fd","Type":"ContainerStarted","Data":"21a384acf2e349aeba270f32d4e5aa2a5df9d099b690b8030fa9db6794e0997f"} Mar 13 20:59:05 crc kubenswrapper[4790]: I0313 20:59:05.143248 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" podStartSLOduration=1.4373358760000001 podStartE2EDuration="2.143228204s" podCreationTimestamp="2026-03-13 20:59:03 +0000 UTC" firstStartedPulling="2026-03-13 20:59:04.048253194 +0000 UTC m=+1875.069369075" lastFinishedPulling="2026-03-13 20:59:04.754145292 +0000 UTC m=+1875.775261403" observedRunningTime="2026-03-13 20:59:05.135150473 +0000 UTC m=+1876.156266364" watchObservedRunningTime="2026-03-13 20:59:05.143228204 +0000 UTC m=+1876.164344095" Mar 13 20:59:05 crc kubenswrapper[4790]: I0313 20:59:05.660367 4790 scope.go:117] 
"RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:59:05 crc kubenswrapper[4790]: E0313 20:59:05.660950 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:59:10 crc kubenswrapper[4790]: I0313 20:59:10.165277 4790 generic.go:334] "Generic (PLEG): container finished" podID="20beb5d9-49e6-47c7-a3ad-107ff79e56fd" containerID="21a384acf2e349aeba270f32d4e5aa2a5df9d099b690b8030fa9db6794e0997f" exitCode=0 Mar 13 20:59:10 crc kubenswrapper[4790]: I0313 20:59:10.165351 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" event={"ID":"20beb5d9-49e6-47c7-a3ad-107ff79e56fd","Type":"ContainerDied","Data":"21a384acf2e349aeba270f32d4e5aa2a5df9d099b690b8030fa9db6794e0997f"} Mar 13 20:59:11 crc kubenswrapper[4790]: I0313 20:59:11.572677 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:11 crc kubenswrapper[4790]: I0313 20:59:11.640123 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-inventory\") pod \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " Mar 13 20:59:11 crc kubenswrapper[4790]: I0313 20:59:11.640319 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-ssh-key-openstack-edpm-ipam\") pod \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " Mar 13 20:59:11 crc kubenswrapper[4790]: I0313 20:59:11.640357 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7phtc\" (UniqueName: \"kubernetes.io/projected/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-kube-api-access-7phtc\") pod \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " Mar 13 20:59:11 crc kubenswrapper[4790]: I0313 20:59:11.645844 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-kube-api-access-7phtc" (OuterVolumeSpecName: "kube-api-access-7phtc") pod "20beb5d9-49e6-47c7-a3ad-107ff79e56fd" (UID: "20beb5d9-49e6-47c7-a3ad-107ff79e56fd"). InnerVolumeSpecName "kube-api-access-7phtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:59:11 crc kubenswrapper[4790]: I0313 20:59:11.666815 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-inventory" (OuterVolumeSpecName: "inventory") pod "20beb5d9-49e6-47c7-a3ad-107ff79e56fd" (UID: "20beb5d9-49e6-47c7-a3ad-107ff79e56fd"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:59:11 crc kubenswrapper[4790]: I0313 20:59:11.668200 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "20beb5d9-49e6-47c7-a3ad-107ff79e56fd" (UID: "20beb5d9-49e6-47c7-a3ad-107ff79e56fd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:59:11 crc kubenswrapper[4790]: I0313 20:59:11.742784 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 20:59:11 crc kubenswrapper[4790]: I0313 20:59:11.742823 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 20:59:11 crc kubenswrapper[4790]: I0313 20:59:11.742837 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7phtc\" (UniqueName: \"kubernetes.io/projected/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-kube-api-access-7phtc\") on node \"crc\" DevicePath \"\"" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.182268 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" event={"ID":"20beb5d9-49e6-47c7-a3ad-107ff79e56fd","Type":"ContainerDied","Data":"f71719d135da71fc8f3439eeddb8cb7ddd2e732ead2768f62f79060482322eb0"} Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.182570 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f71719d135da71fc8f3439eeddb8cb7ddd2e732ead2768f62f79060482322eb0" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 
20:59:12.182416 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.246275 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b"] Mar 13 20:59:12 crc kubenswrapper[4790]: E0313 20:59:12.246757 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20beb5d9-49e6-47c7-a3ad-107ff79e56fd" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.246774 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="20beb5d9-49e6-47c7-a3ad-107ff79e56fd" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.247002 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="20beb5d9-49e6-47c7-a3ad-107ff79e56fd" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.247694 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.250708 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.251511 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.251897 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.252285 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.257755 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b"] Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.354435 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mjj4b\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.354867 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9f96\" (UniqueName: \"kubernetes.io/projected/04553c47-94a9-465f-a241-9188784794de-kube-api-access-s9f96\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mjj4b\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 
20:59:12.354914 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mjj4b\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.457510 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9f96\" (UniqueName: \"kubernetes.io/projected/04553c47-94a9-465f-a241-9188784794de-kube-api-access-s9f96\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mjj4b\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.457591 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mjj4b\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.457721 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mjj4b\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.461979 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mjj4b\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.462990 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mjj4b\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.478111 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9f96\" (UniqueName: \"kubernetes.io/projected/04553c47-94a9-465f-a241-9188784794de-kube-api-access-s9f96\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mjj4b\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.565893 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:13 crc kubenswrapper[4790]: I0313 20:59:13.091003 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b"] Mar 13 20:59:13 crc kubenswrapper[4790]: I0313 20:59:13.190727 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" event={"ID":"04553c47-94a9-465f-a241-9188784794de","Type":"ContainerStarted","Data":"4c0f8c6ac936b6063988c5028348af88d79a2e886d04d046b7d47b0ee8ff6d1a"} Mar 13 20:59:14 crc kubenswrapper[4790]: I0313 20:59:14.203232 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" event={"ID":"04553c47-94a9-465f-a241-9188784794de","Type":"ContainerStarted","Data":"681a7ba31862621718cb104c0d209c0de7ce953a8831d62f8f14d103d63a60a9"} Mar 13 20:59:14 crc kubenswrapper[4790]: I0313 20:59:14.226243 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" podStartSLOduration=1.816680792 podStartE2EDuration="2.226223958s" podCreationTimestamp="2026-03-13 20:59:12 +0000 UTC" firstStartedPulling="2026-03-13 20:59:13.09720017 +0000 UTC m=+1884.118316101" lastFinishedPulling="2026-03-13 20:59:13.506743376 +0000 UTC m=+1884.527859267" observedRunningTime="2026-03-13 20:59:14.223353469 +0000 UTC m=+1885.244469360" watchObservedRunningTime="2026-03-13 20:59:14.226223958 +0000 UTC m=+1885.247339869" Mar 13 20:59:17 crc kubenswrapper[4790]: I0313 20:59:17.661155 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:59:17 crc kubenswrapper[4790]: E0313 20:59:17.661935 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:59:28 crc kubenswrapper[4790]: I0313 20:59:28.660895 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:59:28 crc kubenswrapper[4790]: E0313 20:59:28.661709 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:59:35 crc kubenswrapper[4790]: I0313 20:59:35.037329 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-82klj"] Mar 13 20:59:35 crc kubenswrapper[4790]: I0313 20:59:35.047653 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-82klj"] Mar 13 20:59:35 crc kubenswrapper[4790]: I0313 20:59:35.671398 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04b866fe-5d7d-46ab-9074-b93ddc7724f0" path="/var/lib/kubelet/pods/04b866fe-5d7d-46ab-9074-b93ddc7724f0/volumes" Mar 13 20:59:40 crc kubenswrapper[4790]: I0313 20:59:40.659745 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:59:40 crc kubenswrapper[4790]: E0313 20:59:40.660540 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:59:48 crc kubenswrapper[4790]: I0313 20:59:48.521083 4790 generic.go:334] "Generic (PLEG): container finished" podID="04553c47-94a9-465f-a241-9188784794de" containerID="681a7ba31862621718cb104c0d209c0de7ce953a8831d62f8f14d103d63a60a9" exitCode=0 Mar 13 20:59:48 crc kubenswrapper[4790]: I0313 20:59:48.521116 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" event={"ID":"04553c47-94a9-465f-a241-9188784794de","Type":"ContainerDied","Data":"681a7ba31862621718cb104c0d209c0de7ce953a8831d62f8f14d103d63a60a9"} Mar 13 20:59:49 crc kubenswrapper[4790]: I0313 20:59:49.943482 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:49 crc kubenswrapper[4790]: I0313 20:59:49.987850 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9f96\" (UniqueName: \"kubernetes.io/projected/04553c47-94a9-465f-a241-9188784794de-kube-api-access-s9f96\") pod \"04553c47-94a9-465f-a241-9188784794de\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " Mar 13 20:59:49 crc kubenswrapper[4790]: I0313 20:59:49.988070 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-inventory\") pod \"04553c47-94a9-465f-a241-9188784794de\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " Mar 13 20:59:49 crc kubenswrapper[4790]: I0313 20:59:49.988145 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-ssh-key-openstack-edpm-ipam\") pod \"04553c47-94a9-465f-a241-9188784794de\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " Mar 13 20:59:49 crc kubenswrapper[4790]: I0313 20:59:49.993323 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04553c47-94a9-465f-a241-9188784794de-kube-api-access-s9f96" (OuterVolumeSpecName: "kube-api-access-s9f96") pod "04553c47-94a9-465f-a241-9188784794de" (UID: "04553c47-94a9-465f-a241-9188784794de"). InnerVolumeSpecName "kube-api-access-s9f96". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:59:50 crc kubenswrapper[4790]: E0313 20:59:50.011466 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-inventory podName:04553c47-94a9-465f-a241-9188784794de nodeName:}" failed. No retries permitted until 2026-03-13 20:59:50.511429466 +0000 UTC m=+1921.532545357 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-inventory") pod "04553c47-94a9-465f-a241-9188784794de" (UID: "04553c47-94a9-465f-a241-9188784794de") : error deleting /var/lib/kubelet/pods/04553c47-94a9-465f-a241-9188784794de/volume-subpaths: remove /var/lib/kubelet/pods/04553c47-94a9-465f-a241-9188784794de/volume-subpaths: no such file or directory Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.014078 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "04553c47-94a9-465f-a241-9188784794de" (UID: "04553c47-94a9-465f-a241-9188784794de"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.090907 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9f96\" (UniqueName: \"kubernetes.io/projected/04553c47-94a9-465f-a241-9188784794de-kube-api-access-s9f96\") on node \"crc\" DevicePath \"\"" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.090942 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.539347 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" event={"ID":"04553c47-94a9-465f-a241-9188784794de","Type":"ContainerDied","Data":"4c0f8c6ac936b6063988c5028348af88d79a2e886d04d046b7d47b0ee8ff6d1a"} Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.539448 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c0f8c6ac936b6063988c5028348af88d79a2e886d04d046b7d47b0ee8ff6d1a" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.539503 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.600477 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-inventory\") pod \"04553c47-94a9-465f-a241-9188784794de\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.603889 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-inventory" (OuterVolumeSpecName: "inventory") pod "04553c47-94a9-465f-a241-9188784794de" (UID: "04553c47-94a9-465f-a241-9188784794de"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.641300 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk"] Mar 13 20:59:50 crc kubenswrapper[4790]: E0313 20:59:50.641781 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04553c47-94a9-465f-a241-9188784794de" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.641800 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="04553c47-94a9-465f-a241-9188784794de" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.642028 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="04553c47-94a9-465f-a241-9188784794de" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.642831 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.647411 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk"] Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.702246 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.702670 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsn56\" (UniqueName: \"kubernetes.io/projected/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-kube-api-access-hsn56\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.702788 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.703061 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 
20:59:50.805305 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.805360 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsn56\" (UniqueName: \"kubernetes.io/projected/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-kube-api-access-hsn56\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.805403 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.809559 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.809810 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-inventory\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.821635 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsn56\" (UniqueName: \"kubernetes.io/projected/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-kube-api-access-hsn56\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.984632 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 20:59:51 crc kubenswrapper[4790]: I0313 20:59:51.478116 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk"] Mar 13 20:59:51 crc kubenswrapper[4790]: W0313 20:59:51.481021 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7e18dc0_dbbb_419e_bdad_22b5f08ffa6f.slice/crio-a6fa19e5fb9a52e8d274e36f7e61c8bca9ed52e76cf05a06f56817ed6042a256 WatchSource:0}: Error finding container a6fa19e5fb9a52e8d274e36f7e61c8bca9ed52e76cf05a06f56817ed6042a256: Status 404 returned error can't find the container with id a6fa19e5fb9a52e8d274e36f7e61c8bca9ed52e76cf05a06f56817ed6042a256 Mar 13 20:59:51 crc kubenswrapper[4790]: I0313 20:59:51.549597 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" event={"ID":"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f","Type":"ContainerStarted","Data":"a6fa19e5fb9a52e8d274e36f7e61c8bca9ed52e76cf05a06f56817ed6042a256"} Mar 13 20:59:52 crc kubenswrapper[4790]: I0313 20:59:52.560490 4790 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" event={"ID":"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f","Type":"ContainerStarted","Data":"c8ddee344e61f57e55bf975cc9ff728e15bf1f3150e4544252973126358814b9"} Mar 13 20:59:52 crc kubenswrapper[4790]: I0313 20:59:52.578526 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" podStartSLOduration=2.091700361 podStartE2EDuration="2.578509463s" podCreationTimestamp="2026-03-13 20:59:50 +0000 UTC" firstStartedPulling="2026-03-13 20:59:51.483593953 +0000 UTC m=+1922.504709844" lastFinishedPulling="2026-03-13 20:59:51.970403055 +0000 UTC m=+1922.991518946" observedRunningTime="2026-03-13 20:59:52.577334921 +0000 UTC m=+1923.598450832" watchObservedRunningTime="2026-03-13 20:59:52.578509463 +0000 UTC m=+1923.599625354" Mar 13 20:59:52 crc kubenswrapper[4790]: I0313 20:59:52.660238 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:59:53 crc kubenswrapper[4790]: I0313 20:59:53.043537 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-gj4j7"] Mar 13 20:59:53 crc kubenswrapper[4790]: I0313 20:59:53.051158 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-gj4j7"] Mar 13 20:59:53 crc kubenswrapper[4790]: I0313 20:59:53.578996 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"a9a94b980a92050256811681ca21f1352e966795dd8d0d5b7f29e267e6b5c0a4"} Mar 13 20:59:53 crc kubenswrapper[4790]: I0313 20:59:53.672002 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e71d98c3-e247-448e-945e-016a6755c689" 
path="/var/lib/kubelet/pods/e71d98c3-e247-448e-945e-016a6755c689/volumes" Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.024938 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bh2vb"] Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.032991 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bh2vb"] Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.559282 4790 scope.go:117] "RemoveContainer" containerID="ac99b8592ceb7c3e6a37fbb0c9de0300f9c9ee5a2b4807abffe2d2ed52e8fe04" Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.584462 4790 scope.go:117] "RemoveContainer" containerID="6d9662cc81f66265ce8ecfaf149044a45f9586bc1e7f991bca5d3650ff0fd63f" Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.646355 4790 scope.go:117] "RemoveContainer" containerID="749c82e4067fc52a2714101b9401b4c82b0470e8a2bd0821a82732111bf3a2ae" Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.672352 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="255451e0-9cb8-424f-a327-6e7ef4e4d775" path="/var/lib/kubelet/pods/255451e0-9cb8-424f-a327-6e7ef4e4d775/volumes" Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.673693 4790 scope.go:117] "RemoveContainer" containerID="670aaab126129ee380c6ae05f38d955bab6fe47a4a8d19ac0dbaca35d3cd9ecc" Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.738344 4790 scope.go:117] "RemoveContainer" containerID="2532c9c9471a4f51d2c72742172102590d5f8b86465110fbcffff19c31b75b68" Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.761018 4790 scope.go:117] "RemoveContainer" containerID="ecda3f7499b0977157d22e381725d43a5571bfd9425676b723008c4d5d967330" Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.802953 4790 scope.go:117] "RemoveContainer" containerID="c0e58f35f1d7b48efbdbbc91a297aa591c210bb71e60644cb81c14c40a9e45cb" Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.853612 4790 scope.go:117] 
"RemoveContainer" containerID="d6d96802df47b7b6e53732dfd053c7dabc95a96dcf532db8586c981fb4fcd115" Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.911684 4790 scope.go:117] "RemoveContainer" containerID="15f4fd3d9e2092ff500a17b34ac7be646f532a2e8275aea162c7ec8133dbdbed" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.146421 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557260-m6wtk"] Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.148353 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557260-m6wtk" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.150669 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.150668 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.153461 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.155400 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf"] Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.156733 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.159257 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.162267 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.168039 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557260-m6wtk"] Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.178249 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf"] Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.213516 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77wxt\" (UniqueName: \"kubernetes.io/projected/6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0-kube-api-access-77wxt\") pod \"auto-csr-approver-29557260-m6wtk\" (UID: \"6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0\") " pod="openshift-infra/auto-csr-approver-29557260-m6wtk" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.213800 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/427c23ef-3e13-432b-98b4-08a6aa5b7cff-config-volume\") pod \"collect-profiles-29557260-8j5bf\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.213888 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7m96\" (UniqueName: 
\"kubernetes.io/projected/427c23ef-3e13-432b-98b4-08a6aa5b7cff-kube-api-access-n7m96\") pod \"collect-profiles-29557260-8j5bf\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.213963 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/427c23ef-3e13-432b-98b4-08a6aa5b7cff-secret-volume\") pod \"collect-profiles-29557260-8j5bf\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.315363 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77wxt\" (UniqueName: \"kubernetes.io/projected/6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0-kube-api-access-77wxt\") pod \"auto-csr-approver-29557260-m6wtk\" (UID: \"6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0\") " pod="openshift-infra/auto-csr-approver-29557260-m6wtk" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.315500 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/427c23ef-3e13-432b-98b4-08a6aa5b7cff-config-volume\") pod \"collect-profiles-29557260-8j5bf\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.315531 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7m96\" (UniqueName: \"kubernetes.io/projected/427c23ef-3e13-432b-98b4-08a6aa5b7cff-kube-api-access-n7m96\") pod \"collect-profiles-29557260-8j5bf\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:00 crc 
kubenswrapper[4790]: I0313 21:00:00.315565 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/427c23ef-3e13-432b-98b4-08a6aa5b7cff-secret-volume\") pod \"collect-profiles-29557260-8j5bf\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.316228 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/427c23ef-3e13-432b-98b4-08a6aa5b7cff-config-volume\") pod \"collect-profiles-29557260-8j5bf\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.327328 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/427c23ef-3e13-432b-98b4-08a6aa5b7cff-secret-volume\") pod \"collect-profiles-29557260-8j5bf\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.332947 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77wxt\" (UniqueName: \"kubernetes.io/projected/6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0-kube-api-access-77wxt\") pod \"auto-csr-approver-29557260-m6wtk\" (UID: \"6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0\") " pod="openshift-infra/auto-csr-approver-29557260-m6wtk" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.332948 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7m96\" (UniqueName: \"kubernetes.io/projected/427c23ef-3e13-432b-98b4-08a6aa5b7cff-kube-api-access-n7m96\") pod \"collect-profiles-29557260-8j5bf\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.469239 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557260-m6wtk" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.481599 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.922429 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557260-m6wtk"] Mar 13 21:00:01 crc kubenswrapper[4790]: W0313 21:00:01.016042 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod427c23ef_3e13_432b_98b4_08a6aa5b7cff.slice/crio-e563b5c0f8a16abc16171fe76c0a970215a28ff45f70931625f4bc87069b5e32 WatchSource:0}: Error finding container e563b5c0f8a16abc16171fe76c0a970215a28ff45f70931625f4bc87069b5e32: Status 404 returned error can't find the container with id e563b5c0f8a16abc16171fe76c0a970215a28ff45f70931625f4bc87069b5e32 Mar 13 21:00:01 crc kubenswrapper[4790]: I0313 21:00:01.019065 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf"] Mar 13 21:00:01 crc kubenswrapper[4790]: I0313 21:00:01.668468 4790 generic.go:334] "Generic (PLEG): container finished" podID="427c23ef-3e13-432b-98b4-08a6aa5b7cff" containerID="6b6ac28f388fd53f46ab8b3943c1bd45ee090848b8aabd597c3e5c7ae5662495" exitCode=0 Mar 13 21:00:01 crc kubenswrapper[4790]: I0313 21:00:01.669490 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557260-m6wtk" event={"ID":"6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0","Type":"ContainerStarted","Data":"e5c5b460ceb349db78baa661bfdbefbccd67389a158be1d27ad4b75995e8252b"} Mar 13 21:00:01 crc 
kubenswrapper[4790]: I0313 21:00:01.669528 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" event={"ID":"427c23ef-3e13-432b-98b4-08a6aa5b7cff","Type":"ContainerDied","Data":"6b6ac28f388fd53f46ab8b3943c1bd45ee090848b8aabd597c3e5c7ae5662495"} Mar 13 21:00:01 crc kubenswrapper[4790]: I0313 21:00:01.669540 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" event={"ID":"427c23ef-3e13-432b-98b4-08a6aa5b7cff","Type":"ContainerStarted","Data":"e563b5c0f8a16abc16171fe76c0a970215a28ff45f70931625f4bc87069b5e32"} Mar 13 21:00:02 crc kubenswrapper[4790]: I0313 21:00:02.992003 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.067169 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/427c23ef-3e13-432b-98b4-08a6aa5b7cff-secret-volume\") pod \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.067255 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/427c23ef-3e13-432b-98b4-08a6aa5b7cff-config-volume\") pod \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.067338 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7m96\" (UniqueName: \"kubernetes.io/projected/427c23ef-3e13-432b-98b4-08a6aa5b7cff-kube-api-access-n7m96\") pod \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " Mar 13 21:00:03 crc kubenswrapper[4790]: 
I0313 21:00:03.068625 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/427c23ef-3e13-432b-98b4-08a6aa5b7cff-config-volume" (OuterVolumeSpecName: "config-volume") pod "427c23ef-3e13-432b-98b4-08a6aa5b7cff" (UID: "427c23ef-3e13-432b-98b4-08a6aa5b7cff"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.073992 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427c23ef-3e13-432b-98b4-08a6aa5b7cff-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "427c23ef-3e13-432b-98b4-08a6aa5b7cff" (UID: "427c23ef-3e13-432b-98b4-08a6aa5b7cff"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.074280 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/427c23ef-3e13-432b-98b4-08a6aa5b7cff-kube-api-access-n7m96" (OuterVolumeSpecName: "kube-api-access-n7m96") pod "427c23ef-3e13-432b-98b4-08a6aa5b7cff" (UID: "427c23ef-3e13-432b-98b4-08a6aa5b7cff"). InnerVolumeSpecName "kube-api-access-n7m96". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.169657 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/427c23ef-3e13-432b-98b4-08a6aa5b7cff-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.169706 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/427c23ef-3e13-432b-98b4-08a6aa5b7cff-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.169718 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7m96\" (UniqueName: \"kubernetes.io/projected/427c23ef-3e13-432b-98b4-08a6aa5b7cff-kube-api-access-n7m96\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.691616 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" event={"ID":"427c23ef-3e13-432b-98b4-08a6aa5b7cff","Type":"ContainerDied","Data":"e563b5c0f8a16abc16171fe76c0a970215a28ff45f70931625f4bc87069b5e32"} Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.692155 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e563b5c0f8a16abc16171fe76c0a970215a28ff45f70931625f4bc87069b5e32" Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.692263 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:04 crc kubenswrapper[4790]: I0313 21:00:04.700133 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557260-m6wtk" event={"ID":"6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0","Type":"ContainerStarted","Data":"5bdb5fc52fc30f3ba02b7731679748560a6cefd0cfb24c581e1cc818e8a93cb1"} Mar 13 21:00:04 crc kubenswrapper[4790]: I0313 21:00:04.714016 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557260-m6wtk" podStartSLOduration=1.257896753 podStartE2EDuration="4.713995785s" podCreationTimestamp="2026-03-13 21:00:00 +0000 UTC" firstStartedPulling="2026-03-13 21:00:00.930471785 +0000 UTC m=+1931.951587676" lastFinishedPulling="2026-03-13 21:00:04.386570817 +0000 UTC m=+1935.407686708" observedRunningTime="2026-03-13 21:00:04.713504211 +0000 UTC m=+1935.734620102" watchObservedRunningTime="2026-03-13 21:00:04.713995785 +0000 UTC m=+1935.735111676" Mar 13 21:00:05 crc kubenswrapper[4790]: I0313 21:00:05.712801 4790 generic.go:334] "Generic (PLEG): container finished" podID="6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0" containerID="5bdb5fc52fc30f3ba02b7731679748560a6cefd0cfb24c581e1cc818e8a93cb1" exitCode=0 Mar 13 21:00:05 crc kubenswrapper[4790]: I0313 21:00:05.712868 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557260-m6wtk" event={"ID":"6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0","Type":"ContainerDied","Data":"5bdb5fc52fc30f3ba02b7731679748560a6cefd0cfb24c581e1cc818e8a93cb1"} Mar 13 21:00:07 crc kubenswrapper[4790]: I0313 21:00:07.070319 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557260-m6wtk" Mar 13 21:00:07 crc kubenswrapper[4790]: I0313 21:00:07.139912 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77wxt\" (UniqueName: \"kubernetes.io/projected/6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0-kube-api-access-77wxt\") pod \"6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0\" (UID: \"6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0\") " Mar 13 21:00:07 crc kubenswrapper[4790]: I0313 21:00:07.145568 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0-kube-api-access-77wxt" (OuterVolumeSpecName: "kube-api-access-77wxt") pod "6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0" (UID: "6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0"). InnerVolumeSpecName "kube-api-access-77wxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:00:07 crc kubenswrapper[4790]: I0313 21:00:07.243589 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77wxt\" (UniqueName: \"kubernetes.io/projected/6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0-kube-api-access-77wxt\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:07 crc kubenswrapper[4790]: I0313 21:00:07.739736 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557260-m6wtk" event={"ID":"6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0","Type":"ContainerDied","Data":"e5c5b460ceb349db78baa661bfdbefbccd67389a158be1d27ad4b75995e8252b"} Mar 13 21:00:07 crc kubenswrapper[4790]: I0313 21:00:07.739776 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5c5b460ceb349db78baa661bfdbefbccd67389a158be1d27ad4b75995e8252b" Mar 13 21:00:07 crc kubenswrapper[4790]: I0313 21:00:07.739962 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557260-m6wtk" Mar 13 21:00:07 crc kubenswrapper[4790]: I0313 21:00:07.775367 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557254-62lxw"] Mar 13 21:00:07 crc kubenswrapper[4790]: I0313 21:00:07.784121 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557254-62lxw"] Mar 13 21:00:09 crc kubenswrapper[4790]: I0313 21:00:09.676859 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="173eb1b0-728a-4420-bfab-ba33ae08f5eb" path="/var/lib/kubelet/pods/173eb1b0-728a-4420-bfab-ba33ae08f5eb/volumes" Mar 13 21:00:36 crc kubenswrapper[4790]: I0313 21:00:36.001639 4790 generic.go:334] "Generic (PLEG): container finished" podID="f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f" containerID="c8ddee344e61f57e55bf975cc9ff728e15bf1f3150e4544252973126358814b9" exitCode=0 Mar 13 21:00:36 crc kubenswrapper[4790]: I0313 21:00:36.001752 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" event={"ID":"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f","Type":"ContainerDied","Data":"c8ddee344e61f57e55bf975cc9ff728e15bf1f3150e4544252973126358814b9"} Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.046576 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-sw4k5"] Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.056908 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-sw4k5"] Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.413054 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.549128 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-ssh-key-openstack-edpm-ipam\") pod \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.549367 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-inventory\") pod \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.549473 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsn56\" (UniqueName: \"kubernetes.io/projected/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-kube-api-access-hsn56\") pod \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.555323 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-kube-api-access-hsn56" (OuterVolumeSpecName: "kube-api-access-hsn56") pod "f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f" (UID: "f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f"). InnerVolumeSpecName "kube-api-access-hsn56". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.579137 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f" (UID: "f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.581595 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-inventory" (OuterVolumeSpecName: "inventory") pod "f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f" (UID: "f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.652140 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.652176 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.652186 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsn56\" (UniqueName: \"kubernetes.io/projected/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-kube-api-access-hsn56\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.670895 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="263e3744-6b98-4d91-aba2-cd28a616d9df" 
path="/var/lib/kubelet/pods/263e3744-6b98-4d91-aba2-cd28a616d9df/volumes" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.025177 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" event={"ID":"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f","Type":"ContainerDied","Data":"a6fa19e5fb9a52e8d274e36f7e61c8bca9ed52e76cf05a06f56817ed6042a256"} Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.025213 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6fa19e5fb9a52e8d274e36f7e61c8bca9ed52e76cf05a06f56817ed6042a256" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.025600 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.124125 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-pdxrj"] Mar 13 21:00:38 crc kubenswrapper[4790]: E0313 21:00:38.125244 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="427c23ef-3e13-432b-98b4-08a6aa5b7cff" containerName="collect-profiles" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.125323 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="427c23ef-3e13-432b-98b4-08a6aa5b7cff" containerName="collect-profiles" Mar 13 21:00:38 crc kubenswrapper[4790]: E0313 21:00:38.125398 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0" containerName="oc" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.125453 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0" containerName="oc" Mar 13 21:00:38 crc kubenswrapper[4790]: E0313 21:00:38.125521 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f" 
containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.125603 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.125864 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.125945 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="427c23ef-3e13-432b-98b4-08a6aa5b7cff" containerName="collect-profiles" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.126011 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0" containerName="oc" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.126714 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.129701 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.129943 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.130164 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.130221 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.137706 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-pdxrj"] Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.266302 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96gt4\" (UniqueName: \"kubernetes.io/projected/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-kube-api-access-96gt4\") pod \"ssh-known-hosts-edpm-deployment-pdxrj\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.266395 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-pdxrj\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.266495 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-pdxrj\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.370193 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96gt4\" (UniqueName: \"kubernetes.io/projected/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-kube-api-access-96gt4\") pod \"ssh-known-hosts-edpm-deployment-pdxrj\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.370248 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-pdxrj\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.370338 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-pdxrj\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.379335 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-pdxrj\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:38 crc 
kubenswrapper[4790]: I0313 21:00:38.381607 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-pdxrj\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.388206 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96gt4\" (UniqueName: \"kubernetes.io/projected/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-kube-api-access-96gt4\") pod \"ssh-known-hosts-edpm-deployment-pdxrj\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.443330 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.906819 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-pdxrj"] Mar 13 21:00:39 crc kubenswrapper[4790]: I0313 21:00:39.034842 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" event={"ID":"ee77aab6-b3c2-4925-a715-428a4c5e5bd9","Type":"ContainerStarted","Data":"5f6755c2ce51cca35693a7909f948d2dc09cc15243764bfb0a65cce83e1980ba"} Mar 13 21:00:40 crc kubenswrapper[4790]: I0313 21:00:40.044024 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" event={"ID":"ee77aab6-b3c2-4925-a715-428a4c5e5bd9","Type":"ContainerStarted","Data":"77d248b88c2d0c4526ca72adc06ae8f841c05ba15cefbf5df17827e6142b336f"} Mar 13 21:00:40 crc kubenswrapper[4790]: I0313 21:00:40.066232 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" podStartSLOduration=1.6241604170000001 podStartE2EDuration="2.066215097s" podCreationTimestamp="2026-03-13 21:00:38 +0000 UTC" firstStartedPulling="2026-03-13 21:00:38.917236088 +0000 UTC m=+1969.938351979" lastFinishedPulling="2026-03-13 21:00:39.359290768 +0000 UTC m=+1970.380406659" observedRunningTime="2026-03-13 21:00:40.063131913 +0000 UTC m=+1971.084247814" watchObservedRunningTime="2026-03-13 21:00:40.066215097 +0000 UTC m=+1971.087330988" Mar 13 21:00:46 crc kubenswrapper[4790]: I0313 21:00:46.094687 4790 generic.go:334] "Generic (PLEG): container finished" podID="ee77aab6-b3c2-4925-a715-428a4c5e5bd9" containerID="77d248b88c2d0c4526ca72adc06ae8f841c05ba15cefbf5df17827e6142b336f" exitCode=0 Mar 13 21:00:46 crc kubenswrapper[4790]: I0313 21:00:46.094793 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" event={"ID":"ee77aab6-b3c2-4925-a715-428a4c5e5bd9","Type":"ContainerDied","Data":"77d248b88c2d0c4526ca72adc06ae8f841c05ba15cefbf5df17827e6142b336f"} Mar 13 21:00:47 crc kubenswrapper[4790]: I0313 21:00:47.533753 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:47 crc kubenswrapper[4790]: I0313 21:00:47.638115 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-ssh-key-openstack-edpm-ipam\") pod \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " Mar 13 21:00:47 crc kubenswrapper[4790]: I0313 21:00:47.638263 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-inventory-0\") pod \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " Mar 13 21:00:47 crc kubenswrapper[4790]: I0313 21:00:47.638402 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96gt4\" (UniqueName: \"kubernetes.io/projected/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-kube-api-access-96gt4\") pod \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " Mar 13 21:00:47 crc kubenswrapper[4790]: I0313 21:00:47.649920 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-kube-api-access-96gt4" (OuterVolumeSpecName: "kube-api-access-96gt4") pod "ee77aab6-b3c2-4925-a715-428a4c5e5bd9" (UID: "ee77aab6-b3c2-4925-a715-428a4c5e5bd9"). InnerVolumeSpecName "kube-api-access-96gt4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:00:47 crc kubenswrapper[4790]: I0313 21:00:47.665516 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ee77aab6-b3c2-4925-a715-428a4c5e5bd9" (UID: "ee77aab6-b3c2-4925-a715-428a4c5e5bd9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:00:47 crc kubenswrapper[4790]: I0313 21:00:47.670742 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "ee77aab6-b3c2-4925-a715-428a4c5e5bd9" (UID: "ee77aab6-b3c2-4925-a715-428a4c5e5bd9"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:00:47 crc kubenswrapper[4790]: I0313 21:00:47.741582 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:47 crc kubenswrapper[4790]: I0313 21:00:47.741631 4790 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:47 crc kubenswrapper[4790]: I0313 21:00:47.741642 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96gt4\" (UniqueName: \"kubernetes.io/projected/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-kube-api-access-96gt4\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.112162 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" 
event={"ID":"ee77aab6-b3c2-4925-a715-428a4c5e5bd9","Type":"ContainerDied","Data":"5f6755c2ce51cca35693a7909f948d2dc09cc15243764bfb0a65cce83e1980ba"} Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.112202 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f6755c2ce51cca35693a7909f948d2dc09cc15243764bfb0a65cce83e1980ba" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.112253 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.177173 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8"] Mar 13 21:00:48 crc kubenswrapper[4790]: E0313 21:00:48.177755 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee77aab6-b3c2-4925-a715-428a4c5e5bd9" containerName="ssh-known-hosts-edpm-deployment" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.177783 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee77aab6-b3c2-4925-a715-428a4c5e5bd9" containerName="ssh-known-hosts-edpm-deployment" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.178045 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee77aab6-b3c2-4925-a715-428a4c5e5bd9" containerName="ssh-known-hosts-edpm-deployment" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.178878 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.182560 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.182818 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.182892 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.183291 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.191880 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8"] Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.353526 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rhtx8\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.353743 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rhtx8\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.354076 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx9nr\" (UniqueName: \"kubernetes.io/projected/d19bd67c-441b-4813-8cc3-07c8cf446e42-kube-api-access-xx9nr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rhtx8\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.456423 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rhtx8\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.456813 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rhtx8\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.456942 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx9nr\" (UniqueName: \"kubernetes.io/projected/d19bd67c-441b-4813-8cc3-07c8cf446e42-kube-api-access-xx9nr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rhtx8\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.463003 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-rhtx8\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.464410 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rhtx8\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.473062 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx9nr\" (UniqueName: \"kubernetes.io/projected/d19bd67c-441b-4813-8cc3-07c8cf446e42-kube-api-access-xx9nr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rhtx8\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.505261 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:49 crc kubenswrapper[4790]: I0313 21:00:49.019208 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8"] Mar 13 21:00:49 crc kubenswrapper[4790]: I0313 21:00:49.121198 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" event={"ID":"d19bd67c-441b-4813-8cc3-07c8cf446e42","Type":"ContainerStarted","Data":"bdebea233f23ef8d949a327f2db14c145b93e2ed9f847d431dd693279b443afa"} Mar 13 21:00:49 crc kubenswrapper[4790]: I0313 21:00:49.748116 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 21:00:50 crc kubenswrapper[4790]: I0313 21:00:50.133537 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" event={"ID":"d19bd67c-441b-4813-8cc3-07c8cf446e42","Type":"ContainerStarted","Data":"3dbda74f382829d90b8f3ab821618c380a983071e325c3af0881d8d1ec980d7e"} Mar 13 21:00:50 crc kubenswrapper[4790]: I0313 21:00:50.159627 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" podStartSLOduration=1.4456798530000001 podStartE2EDuration="2.159595113s" podCreationTimestamp="2026-03-13 21:00:48 +0000 UTC" firstStartedPulling="2026-03-13 21:00:49.031862945 +0000 UTC m=+1980.052978836" lastFinishedPulling="2026-03-13 21:00:49.745778205 +0000 UTC m=+1980.766894096" observedRunningTime="2026-03-13 21:00:50.147603564 +0000 UTC m=+1981.168719455" watchObservedRunningTime="2026-03-13 21:00:50.159595113 +0000 UTC m=+1981.180711034" Mar 13 21:00:56 crc kubenswrapper[4790]: I0313 21:00:56.068446 4790 scope.go:117] "RemoveContainer" containerID="1354228427a90e6609d9b0170fc1b61342fc6ff24449709c9abd0f642ea90a66" Mar 13 21:00:56 crc kubenswrapper[4790]: I0313 
21:00:56.108354 4790 scope.go:117] "RemoveContainer" containerID="4f8e347d99704add2e53a060aced55cc22039113443643e0c09d3500a1b42570" Mar 13 21:00:58 crc kubenswrapper[4790]: I0313 21:00:58.225238 4790 generic.go:334] "Generic (PLEG): container finished" podID="d19bd67c-441b-4813-8cc3-07c8cf446e42" containerID="3dbda74f382829d90b8f3ab821618c380a983071e325c3af0881d8d1ec980d7e" exitCode=0 Mar 13 21:00:58 crc kubenswrapper[4790]: I0313 21:00:58.225449 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" event={"ID":"d19bd67c-441b-4813-8cc3-07c8cf446e42","Type":"ContainerDied","Data":"3dbda74f382829d90b8f3ab821618c380a983071e325c3af0881d8d1ec980d7e"} Mar 13 21:00:59 crc kubenswrapper[4790]: I0313 21:00:59.656520 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:59 crc kubenswrapper[4790]: I0313 21:00:59.670313 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-ssh-key-openstack-edpm-ipam\") pod \"d19bd67c-441b-4813-8cc3-07c8cf446e42\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " Mar 13 21:00:59 crc kubenswrapper[4790]: I0313 21:00:59.670750 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx9nr\" (UniqueName: \"kubernetes.io/projected/d19bd67c-441b-4813-8cc3-07c8cf446e42-kube-api-access-xx9nr\") pod \"d19bd67c-441b-4813-8cc3-07c8cf446e42\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " Mar 13 21:00:59 crc kubenswrapper[4790]: I0313 21:00:59.670904 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-inventory\") pod \"d19bd67c-441b-4813-8cc3-07c8cf446e42\" (UID: 
\"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " Mar 13 21:00:59 crc kubenswrapper[4790]: I0313 21:00:59.687304 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d19bd67c-441b-4813-8cc3-07c8cf446e42-kube-api-access-xx9nr" (OuterVolumeSpecName: "kube-api-access-xx9nr") pod "d19bd67c-441b-4813-8cc3-07c8cf446e42" (UID: "d19bd67c-441b-4813-8cc3-07c8cf446e42"). InnerVolumeSpecName "kube-api-access-xx9nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:00:59 crc kubenswrapper[4790]: I0313 21:00:59.713536 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-inventory" (OuterVolumeSpecName: "inventory") pod "d19bd67c-441b-4813-8cc3-07c8cf446e42" (UID: "d19bd67c-441b-4813-8cc3-07c8cf446e42"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:00:59 crc kubenswrapper[4790]: I0313 21:00:59.716673 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d19bd67c-441b-4813-8cc3-07c8cf446e42" (UID: "d19bd67c-441b-4813-8cc3-07c8cf446e42"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:00:59 crc kubenswrapper[4790]: I0313 21:00:59.773740 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:59 crc kubenswrapper[4790]: I0313 21:00:59.773774 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx9nr\" (UniqueName: \"kubernetes.io/projected/d19bd67c-441b-4813-8cc3-07c8cf446e42-kube-api-access-xx9nr\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:59 crc kubenswrapper[4790]: I0313 21:00:59.773784 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.141845 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29557261-5pp9q"] Mar 13 21:01:00 crc kubenswrapper[4790]: E0313 21:01:00.142660 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d19bd67c-441b-4813-8cc3-07c8cf446e42" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.142684 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19bd67c-441b-4813-8cc3-07c8cf446e42" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.142867 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d19bd67c-441b-4813-8cc3-07c8cf446e42" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.144481 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.152885 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29557261-5pp9q"] Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.180070 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-fernet-keys\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.180163 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-config-data\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.180270 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-combined-ca-bundle\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.180406 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x885b\" (UniqueName: \"kubernetes.io/projected/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-kube-api-access-x885b\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.264914 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" event={"ID":"d19bd67c-441b-4813-8cc3-07c8cf446e42","Type":"ContainerDied","Data":"bdebea233f23ef8d949a327f2db14c145b93e2ed9f847d431dd693279b443afa"} Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.264952 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdebea233f23ef8d949a327f2db14c145b93e2ed9f847d431dd693279b443afa" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.265016 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.282223 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-fernet-keys\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.282298 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-config-data\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.282545 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-combined-ca-bundle\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.282717 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x885b\" (UniqueName: 
\"kubernetes.io/projected/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-kube-api-access-x885b\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.294857 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-combined-ca-bundle\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.295808 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-config-data\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.300252 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-fernet-keys\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.314269 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x885b\" (UniqueName: \"kubernetes.io/projected/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-kube-api-access-x885b\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.395904 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt"] Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.397197 4790 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.403764 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.404050 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.404227 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.404439 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.414951 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt"] Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.464912 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.588703 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.589025 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.589095 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6ljr\" (UniqueName: \"kubernetes.io/projected/7cb0d614-f5d9-4862-8059-ad323eec6c59-kube-api-access-c6ljr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.691088 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.691143 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.691206 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6ljr\" (UniqueName: \"kubernetes.io/projected/7cb0d614-f5d9-4862-8059-ad323eec6c59-kube-api-access-c6ljr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.700527 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.712492 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6ljr\" (UniqueName: \"kubernetes.io/projected/7cb0d614-f5d9-4862-8059-ad323eec6c59-kube-api-access-c6ljr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.713610 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt\" 
(UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.720043 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.922092 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29557261-5pp9q"] Mar 13 21:01:01 crc kubenswrapper[4790]: I0313 21:01:01.250679 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt"] Mar 13 21:01:01 crc kubenswrapper[4790]: W0313 21:01:01.252710 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cb0d614_f5d9_4862_8059_ad323eec6c59.slice/crio-285d58131654439d3af5994aa7631a26ae9f04f1b609a18bfb283c53325db423 WatchSource:0}: Error finding container 285d58131654439d3af5994aa7631a26ae9f04f1b609a18bfb283c53325db423: Status 404 returned error can't find the container with id 285d58131654439d3af5994aa7631a26ae9f04f1b609a18bfb283c53325db423 Mar 13 21:01:01 crc kubenswrapper[4790]: I0313 21:01:01.285265 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557261-5pp9q" event={"ID":"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648","Type":"ContainerStarted","Data":"983aa109c38a604ee34bc992ac30687ce8449ee6da1a3d0137206237482d8e8f"} Mar 13 21:01:01 crc kubenswrapper[4790]: I0313 21:01:01.285310 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557261-5pp9q" event={"ID":"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648","Type":"ContainerStarted","Data":"f535c2e9bfbdd2df1c7cf36740fcc50bfa9a5f7bfe93e05fcf8c23101a3e8eec"} Mar 13 21:01:01 crc kubenswrapper[4790]: I0313 21:01:01.288322 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" event={"ID":"7cb0d614-f5d9-4862-8059-ad323eec6c59","Type":"ContainerStarted","Data":"285d58131654439d3af5994aa7631a26ae9f04f1b609a18bfb283c53325db423"} Mar 13 21:01:01 crc kubenswrapper[4790]: I0313 21:01:01.313872 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29557261-5pp9q" podStartSLOduration=1.3138463439999999 podStartE2EDuration="1.313846344s" podCreationTimestamp="2026-03-13 21:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 21:01:01.30015951 +0000 UTC m=+1992.321275411" watchObservedRunningTime="2026-03-13 21:01:01.313846344 +0000 UTC m=+1992.334962235" Mar 13 21:01:02 crc kubenswrapper[4790]: I0313 21:01:02.299018 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" event={"ID":"7cb0d614-f5d9-4862-8059-ad323eec6c59","Type":"ContainerStarted","Data":"54eadb04d171d0b6ed335700084015547ac87a7b17db8895122b9015adb30fc3"} Mar 13 21:01:02 crc kubenswrapper[4790]: I0313 21:01:02.328962 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" podStartSLOduration=1.870610878 podStartE2EDuration="2.328944926s" podCreationTimestamp="2026-03-13 21:01:00 +0000 UTC" firstStartedPulling="2026-03-13 21:01:01.260526013 +0000 UTC m=+1992.281641894" lastFinishedPulling="2026-03-13 21:01:01.718860051 +0000 UTC m=+1992.739975942" observedRunningTime="2026-03-13 21:01:02.326598502 +0000 UTC m=+1993.347714393" watchObservedRunningTime="2026-03-13 21:01:02.328944926 +0000 UTC m=+1993.350060817" Mar 13 21:01:03 crc kubenswrapper[4790]: I0313 21:01:03.308906 4790 generic.go:334] "Generic (PLEG): container finished" podID="65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648" 
containerID="983aa109c38a604ee34bc992ac30687ce8449ee6da1a3d0137206237482d8e8f" exitCode=0 Mar 13 21:01:03 crc kubenswrapper[4790]: I0313 21:01:03.309157 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557261-5pp9q" event={"ID":"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648","Type":"ContainerDied","Data":"983aa109c38a604ee34bc992ac30687ce8449ee6da1a3d0137206237482d8e8f"} Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.673908 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.776182 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x885b\" (UniqueName: \"kubernetes.io/projected/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-kube-api-access-x885b\") pod \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.776279 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-fernet-keys\") pod \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.776333 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-config-data\") pod \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.776399 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-combined-ca-bundle\") pod \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\" (UID: 
\"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.782238 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-kube-api-access-x885b" (OuterVolumeSpecName: "kube-api-access-x885b") pod "65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648" (UID: "65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648"). InnerVolumeSpecName "kube-api-access-x885b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.782288 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648" (UID: "65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.805363 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648" (UID: "65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.833215 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-config-data" (OuterVolumeSpecName: "config-data") pod "65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648" (UID: "65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.878103 4790 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.878145 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.878155 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.878167 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x885b\" (UniqueName: \"kubernetes.io/projected/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-kube-api-access-x885b\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:05 crc kubenswrapper[4790]: I0313 21:01:05.326177 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557261-5pp9q" event={"ID":"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648","Type":"ContainerDied","Data":"f535c2e9bfbdd2df1c7cf36740fcc50bfa9a5f7bfe93e05fcf8c23101a3e8eec"} Mar 13 21:01:05 crc kubenswrapper[4790]: I0313 21:01:05.326219 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f535c2e9bfbdd2df1c7cf36740fcc50bfa9a5f7bfe93e05fcf8c23101a3e8eec" Mar 13 21:01:05 crc kubenswrapper[4790]: I0313 21:01:05.326270 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:11 crc kubenswrapper[4790]: I0313 21:01:11.375423 4790 generic.go:334] "Generic (PLEG): container finished" podID="7cb0d614-f5d9-4862-8059-ad323eec6c59" containerID="54eadb04d171d0b6ed335700084015547ac87a7b17db8895122b9015adb30fc3" exitCode=0 Mar 13 21:01:11 crc kubenswrapper[4790]: I0313 21:01:11.375494 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" event={"ID":"7cb0d614-f5d9-4862-8059-ad323eec6c59","Type":"ContainerDied","Data":"54eadb04d171d0b6ed335700084015547ac87a7b17db8895122b9015adb30fc3"} Mar 13 21:01:12 crc kubenswrapper[4790]: I0313 21:01:12.783868 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:12 crc kubenswrapper[4790]: I0313 21:01:12.853604 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6ljr\" (UniqueName: \"kubernetes.io/projected/7cb0d614-f5d9-4862-8059-ad323eec6c59-kube-api-access-c6ljr\") pod \"7cb0d614-f5d9-4862-8059-ad323eec6c59\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " Mar 13 21:01:12 crc kubenswrapper[4790]: I0313 21:01:12.853807 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-ssh-key-openstack-edpm-ipam\") pod \"7cb0d614-f5d9-4862-8059-ad323eec6c59\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " Mar 13 21:01:12 crc kubenswrapper[4790]: I0313 21:01:12.853872 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-inventory\") pod \"7cb0d614-f5d9-4862-8059-ad323eec6c59\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " Mar 13 21:01:12 crc 
kubenswrapper[4790]: I0313 21:01:12.859775 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cb0d614-f5d9-4862-8059-ad323eec6c59-kube-api-access-c6ljr" (OuterVolumeSpecName: "kube-api-access-c6ljr") pod "7cb0d614-f5d9-4862-8059-ad323eec6c59" (UID: "7cb0d614-f5d9-4862-8059-ad323eec6c59"). InnerVolumeSpecName "kube-api-access-c6ljr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:01:12 crc kubenswrapper[4790]: I0313 21:01:12.882466 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-inventory" (OuterVolumeSpecName: "inventory") pod "7cb0d614-f5d9-4862-8059-ad323eec6c59" (UID: "7cb0d614-f5d9-4862-8059-ad323eec6c59"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:12 crc kubenswrapper[4790]: I0313 21:01:12.888415 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7cb0d614-f5d9-4862-8059-ad323eec6c59" (UID: "7cb0d614-f5d9-4862-8059-ad323eec6c59"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:12 crc kubenswrapper[4790]: I0313 21:01:12.958272 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:12 crc kubenswrapper[4790]: I0313 21:01:12.958310 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:12 crc kubenswrapper[4790]: I0313 21:01:12.958321 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6ljr\" (UniqueName: \"kubernetes.io/projected/7cb0d614-f5d9-4862-8059-ad323eec6c59-kube-api-access-c6ljr\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.392996 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" event={"ID":"7cb0d614-f5d9-4862-8059-ad323eec6c59","Type":"ContainerDied","Data":"285d58131654439d3af5994aa7631a26ae9f04f1b609a18bfb283c53325db423"} Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.393036 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="285d58131654439d3af5994aa7631a26ae9f04f1b609a18bfb283c53325db423" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.393054 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.471149 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5"] Mar 13 21:01:13 crc kubenswrapper[4790]: E0313 21:01:13.471527 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb0d614-f5d9-4862-8059-ad323eec6c59" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.471543 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb0d614-f5d9-4862-8059-ad323eec6c59" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 13 21:01:13 crc kubenswrapper[4790]: E0313 21:01:13.471578 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648" containerName="keystone-cron" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.471586 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648" containerName="keystone-cron" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.471771 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648" containerName="keystone-cron" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.471787 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb0d614-f5d9-4862-8059-ad323eec6c59" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.472369 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.475271 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.475655 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.475779 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.476738 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.476880 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.476817 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.477552 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.477643 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.494674 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5"] Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571239 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571323 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571373 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571434 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571481 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571554 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571583 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571622 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571680 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs5kt\" (UniqueName: 
\"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-kube-api-access-qs5kt\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571711 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571739 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571777 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571804 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571861 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.673922 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.673961 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.673991 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.674015 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.674053 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.674084 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.674110 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-repo-setup-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.674137 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.674163 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.674183 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.674231 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: 
\"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.674249 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.674275 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.674312 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs5kt\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-kube-api-access-qs5kt\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.679036 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 
crc kubenswrapper[4790]: I0313 21:01:13.681960 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.681969 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.682095 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.682208 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.682420 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.682500 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.682514 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.683214 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.683844 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.688761 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.690774 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.692484 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.694998 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs5kt\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-kube-api-access-qs5kt\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: 
\"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.790685 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:14 crc kubenswrapper[4790]: I0313 21:01:14.300976 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5"] Mar 13 21:01:14 crc kubenswrapper[4790]: I0313 21:01:14.403362 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" event={"ID":"77bc94c9-b530-4ea9-8c94-0d5a985fb930","Type":"ContainerStarted","Data":"426f65e9c26d39b596dc573505d7b808196f8c14f2fa3cc5ae9289099b5aa2e9"} Mar 13 21:01:15 crc kubenswrapper[4790]: I0313 21:01:15.413832 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" event={"ID":"77bc94c9-b530-4ea9-8c94-0d5a985fb930","Type":"ContainerStarted","Data":"33421c8bca23196e37ea69a9dbe3facb2e231c9ca13be229cc160b45432ee770"} Mar 13 21:01:15 crc kubenswrapper[4790]: I0313 21:01:15.439839 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" podStartSLOduration=2.007804467 podStartE2EDuration="2.439820273s" podCreationTimestamp="2026-03-13 21:01:13 +0000 UTC" firstStartedPulling="2026-03-13 21:01:14.304247172 +0000 UTC m=+2005.325363053" lastFinishedPulling="2026-03-13 21:01:14.736262968 +0000 UTC m=+2005.757378859" observedRunningTime="2026-03-13 21:01:15.434124147 +0000 UTC m=+2006.455240038" watchObservedRunningTime="2026-03-13 21:01:15.439820273 +0000 UTC m=+2006.460936164" Mar 13 21:01:47 crc kubenswrapper[4790]: I0313 21:01:47.678741 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="77bc94c9-b530-4ea9-8c94-0d5a985fb930" containerID="33421c8bca23196e37ea69a9dbe3facb2e231c9ca13be229cc160b45432ee770" exitCode=0 Mar 13 21:01:47 crc kubenswrapper[4790]: I0313 21:01:47.678833 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" event={"ID":"77bc94c9-b530-4ea9-8c94-0d5a985fb930","Type":"ContainerDied","Data":"33421c8bca23196e37ea69a9dbe3facb2e231c9ca13be229cc160b45432ee770"} Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.081719 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.232541 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-bootstrap-combined-ca-bundle\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.232933 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-libvirt-combined-ca-bundle\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.232968 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-neutron-metadata-combined-ca-bundle\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.233299 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.233574 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-inventory\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.233721 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-telemetry-combined-ca-bundle\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.233789 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-nova-combined-ca-bundle\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.233854 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.233889 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ovn-combined-ca-bundle\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.233924 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-repo-setup-combined-ca-bundle\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.235862 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs5kt\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-kube-api-access-qs5kt\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.236350 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-ovn-default-certs-0\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.236539 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ssh-key-openstack-edpm-ipam\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.236652 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.241408 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.241513 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.241692 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-kube-api-access-qs5kt" (OuterVolumeSpecName: "kube-api-access-qs5kt") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "kube-api-access-qs5kt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.241738 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.241883 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.241951 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.242888 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.244473 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.244485 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.244769 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.245080 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). 
InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.246740 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.272817 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.275339 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-inventory" (OuterVolumeSpecName: "inventory") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339684 4790 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339723 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339734 4790 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339743 4790 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339754 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs5kt\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-kube-api-access-qs5kt\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339762 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339773 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339783 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339793 4790 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339801 4790 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339809 4790 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339818 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339828 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: 
I0313 21:01:49.339836 4790 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.697719 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" event={"ID":"77bc94c9-b530-4ea9-8c94-0d5a985fb930","Type":"ContainerDied","Data":"426f65e9c26d39b596dc573505d7b808196f8c14f2fa3cc5ae9289099b5aa2e9"} Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.697762 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.697772 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="426f65e9c26d39b596dc573505d7b808196f8c14f2fa3cc5ae9289099b5aa2e9" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.789086 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq"] Mar 13 21:01:49 crc kubenswrapper[4790]: E0313 21:01:49.789481 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77bc94c9-b530-4ea9-8c94-0d5a985fb930" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.789499 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="77bc94c9-b530-4ea9-8c94-0d5a985fb930" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.789720 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="77bc94c9-b530-4ea9-8c94-0d5a985fb930" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.790300 4790 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.794818 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.795211 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.795400 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.795666 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.795975 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.803010 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq"] Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.951842 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.952269 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: 
\"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.952470 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57bxh\" (UniqueName: \"kubernetes.io/projected/fe27e2d5-7108-4d49-99bb-15208f36cff7-kube-api-access-57bxh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.952589 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.952704 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.054109 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.054272 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.054315 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.054419 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57bxh\" (UniqueName: \"kubernetes.io/projected/fe27e2d5-7108-4d49-99bb-15208f36cff7-kube-api-access-57bxh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.054461 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.055250 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.062322 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.062425 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.062494 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.074707 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57bxh\" (UniqueName: \"kubernetes.io/projected/fe27e2d5-7108-4d49-99bb-15208f36cff7-kube-api-access-57bxh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.121907 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.640240 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq"] Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.708888 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" event={"ID":"fe27e2d5-7108-4d49-99bb-15208f36cff7","Type":"ContainerStarted","Data":"e338edf26926f4ba8724c786baa1f3ad6cf40efaff64a89683a9459869e28ac9"} Mar 13 21:01:51 crc kubenswrapper[4790]: I0313 21:01:51.719192 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" event={"ID":"fe27e2d5-7108-4d49-99bb-15208f36cff7","Type":"ContainerStarted","Data":"f70c8e8a59a2069c4d31e1d743181c1e04d84701a48632b52e0eea3913ba3e41"} Mar 13 21:01:51 crc kubenswrapper[4790]: I0313 21:01:51.748344 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" podStartSLOduration=2.295100786 podStartE2EDuration="2.748322754s" podCreationTimestamp="2026-03-13 21:01:49 +0000 UTC" firstStartedPulling="2026-03-13 21:01:50.642903327 +0000 UTC m=+2041.664019218" lastFinishedPulling="2026-03-13 21:01:51.096125295 +0000 UTC m=+2042.117241186" observedRunningTime="2026-03-13 21:01:51.739680967 +0000 UTC m=+2042.760796858" watchObservedRunningTime="2026-03-13 21:01:51.748322754 +0000 UTC m=+2042.769438645" Mar 13 21:02:00 crc kubenswrapper[4790]: I0313 21:02:00.141992 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557262-wdkmw"] Mar 13 21:02:00 crc kubenswrapper[4790]: I0313 21:02:00.143745 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557262-wdkmw" Mar 13 21:02:00 crc kubenswrapper[4790]: I0313 21:02:00.145770 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:02:00 crc kubenswrapper[4790]: I0313 21:02:00.145837 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:02:00 crc kubenswrapper[4790]: I0313 21:02:00.146032 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:02:00 crc kubenswrapper[4790]: I0313 21:02:00.150934 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557262-wdkmw"] Mar 13 21:02:00 crc kubenswrapper[4790]: I0313 21:02:00.250757 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mm5h\" (UniqueName: \"kubernetes.io/projected/5b5219d2-3afd-4a8d-ab26-3102b6dee3b0-kube-api-access-7mm5h\") pod \"auto-csr-approver-29557262-wdkmw\" (UID: \"5b5219d2-3afd-4a8d-ab26-3102b6dee3b0\") " pod="openshift-infra/auto-csr-approver-29557262-wdkmw" Mar 13 21:02:00 crc kubenswrapper[4790]: I0313 21:02:00.353110 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mm5h\" (UniqueName: \"kubernetes.io/projected/5b5219d2-3afd-4a8d-ab26-3102b6dee3b0-kube-api-access-7mm5h\") pod \"auto-csr-approver-29557262-wdkmw\" (UID: \"5b5219d2-3afd-4a8d-ab26-3102b6dee3b0\") " pod="openshift-infra/auto-csr-approver-29557262-wdkmw" Mar 13 21:02:00 crc kubenswrapper[4790]: I0313 21:02:00.378016 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mm5h\" (UniqueName: \"kubernetes.io/projected/5b5219d2-3afd-4a8d-ab26-3102b6dee3b0-kube-api-access-7mm5h\") pod \"auto-csr-approver-29557262-wdkmw\" (UID: \"5b5219d2-3afd-4a8d-ab26-3102b6dee3b0\") " 
pod="openshift-infra/auto-csr-approver-29557262-wdkmw" Mar 13 21:02:00 crc kubenswrapper[4790]: I0313 21:02:00.465873 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557262-wdkmw" Mar 13 21:02:00 crc kubenswrapper[4790]: I0313 21:02:00.903100 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557262-wdkmw"] Mar 13 21:02:01 crc kubenswrapper[4790]: I0313 21:02:01.796388 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557262-wdkmw" event={"ID":"5b5219d2-3afd-4a8d-ab26-3102b6dee3b0","Type":"ContainerStarted","Data":"5a26ab7c7362ccce69b1823ca5d7bdfbecdeeb4170dee6dc1ab0d459ae997f3b"} Mar 13 21:02:02 crc kubenswrapper[4790]: I0313 21:02:02.806040 4790 generic.go:334] "Generic (PLEG): container finished" podID="5b5219d2-3afd-4a8d-ab26-3102b6dee3b0" containerID="c9fc9237e156eb0becb6b2dc2279bf5dc16eec046e67e33454f05890e75163e2" exitCode=0 Mar 13 21:02:02 crc kubenswrapper[4790]: I0313 21:02:02.806107 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557262-wdkmw" event={"ID":"5b5219d2-3afd-4a8d-ab26-3102b6dee3b0","Type":"ContainerDied","Data":"c9fc9237e156eb0becb6b2dc2279bf5dc16eec046e67e33454f05890e75163e2"} Mar 13 21:02:04 crc kubenswrapper[4790]: I0313 21:02:04.142972 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557262-wdkmw" Mar 13 21:02:04 crc kubenswrapper[4790]: I0313 21:02:04.236920 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mm5h\" (UniqueName: \"kubernetes.io/projected/5b5219d2-3afd-4a8d-ab26-3102b6dee3b0-kube-api-access-7mm5h\") pod \"5b5219d2-3afd-4a8d-ab26-3102b6dee3b0\" (UID: \"5b5219d2-3afd-4a8d-ab26-3102b6dee3b0\") " Mar 13 21:02:04 crc kubenswrapper[4790]: I0313 21:02:04.246794 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b5219d2-3afd-4a8d-ab26-3102b6dee3b0-kube-api-access-7mm5h" (OuterVolumeSpecName: "kube-api-access-7mm5h") pod "5b5219d2-3afd-4a8d-ab26-3102b6dee3b0" (UID: "5b5219d2-3afd-4a8d-ab26-3102b6dee3b0"). InnerVolumeSpecName "kube-api-access-7mm5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:02:04 crc kubenswrapper[4790]: I0313 21:02:04.339618 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mm5h\" (UniqueName: \"kubernetes.io/projected/5b5219d2-3afd-4a8d-ab26-3102b6dee3b0-kube-api-access-7mm5h\") on node \"crc\" DevicePath \"\"" Mar 13 21:02:04 crc kubenswrapper[4790]: I0313 21:02:04.824544 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557262-wdkmw" event={"ID":"5b5219d2-3afd-4a8d-ab26-3102b6dee3b0","Type":"ContainerDied","Data":"5a26ab7c7362ccce69b1823ca5d7bdfbecdeeb4170dee6dc1ab0d459ae997f3b"} Mar 13 21:02:04 crc kubenswrapper[4790]: I0313 21:02:04.824863 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a26ab7c7362ccce69b1823ca5d7bdfbecdeeb4170dee6dc1ab0d459ae997f3b" Mar 13 21:02:04 crc kubenswrapper[4790]: I0313 21:02:04.824649 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557262-wdkmw" Mar 13 21:02:05 crc kubenswrapper[4790]: I0313 21:02:05.211945 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557256-v26h5"] Mar 13 21:02:05 crc kubenswrapper[4790]: I0313 21:02:05.219805 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557256-v26h5"] Mar 13 21:02:05 crc kubenswrapper[4790]: I0313 21:02:05.674563 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ffc58ad-c12d-4165-bc92-1e948aa14c42" path="/var/lib/kubelet/pods/4ffc58ad-c12d-4165-bc92-1e948aa14c42/volumes" Mar 13 21:02:14 crc kubenswrapper[4790]: I0313 21:02:14.015479 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:02:14 crc kubenswrapper[4790]: I0313 21:02:14.015963 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:02:44 crc kubenswrapper[4790]: I0313 21:02:44.015487 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:02:44 crc kubenswrapper[4790]: I0313 21:02:44.016463 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" 
podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:02:48 crc kubenswrapper[4790]: I0313 21:02:48.222930 4790 generic.go:334] "Generic (PLEG): container finished" podID="fe27e2d5-7108-4d49-99bb-15208f36cff7" containerID="f70c8e8a59a2069c4d31e1d743181c1e04d84701a48632b52e0eea3913ba3e41" exitCode=0 Mar 13 21:02:48 crc kubenswrapper[4790]: I0313 21:02:48.223020 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" event={"ID":"fe27e2d5-7108-4d49-99bb-15208f36cff7","Type":"ContainerDied","Data":"f70c8e8a59a2069c4d31e1d743181c1e04d84701a48632b52e0eea3913ba3e41"} Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.647977 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.728737 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57bxh\" (UniqueName: \"kubernetes.io/projected/fe27e2d5-7108-4d49-99bb-15208f36cff7-kube-api-access-57bxh\") pod \"fe27e2d5-7108-4d49-99bb-15208f36cff7\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.728808 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovn-combined-ca-bundle\") pod \"fe27e2d5-7108-4d49-99bb-15208f36cff7\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.728852 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-inventory\") pod 
\"fe27e2d5-7108-4d49-99bb-15208f36cff7\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.729069 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovncontroller-config-0\") pod \"fe27e2d5-7108-4d49-99bb-15208f36cff7\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.729108 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ssh-key-openstack-edpm-ipam\") pod \"fe27e2d5-7108-4d49-99bb-15208f36cff7\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.735777 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "fe27e2d5-7108-4d49-99bb-15208f36cff7" (UID: "fe27e2d5-7108-4d49-99bb-15208f36cff7"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.737792 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe27e2d5-7108-4d49-99bb-15208f36cff7-kube-api-access-57bxh" (OuterVolumeSpecName: "kube-api-access-57bxh") pod "fe27e2d5-7108-4d49-99bb-15208f36cff7" (UID: "fe27e2d5-7108-4d49-99bb-15208f36cff7"). InnerVolumeSpecName "kube-api-access-57bxh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.757542 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "fe27e2d5-7108-4d49-99bb-15208f36cff7" (UID: "fe27e2d5-7108-4d49-99bb-15208f36cff7"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.765818 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fe27e2d5-7108-4d49-99bb-15208f36cff7" (UID: "fe27e2d5-7108-4d49-99bb-15208f36cff7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.774093 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-inventory" (OuterVolumeSpecName: "inventory") pod "fe27e2d5-7108-4d49-99bb-15208f36cff7" (UID: "fe27e2d5-7108-4d49-99bb-15208f36cff7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.832421 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57bxh\" (UniqueName: \"kubernetes.io/projected/fe27e2d5-7108-4d49-99bb-15208f36cff7-kube-api-access-57bxh\") on node \"crc\" DevicePath \"\"" Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.832501 4790 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.832515 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.832531 4790 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.832544 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.245733 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" event={"ID":"fe27e2d5-7108-4d49-99bb-15208f36cff7","Type":"ContainerDied","Data":"e338edf26926f4ba8724c786baa1f3ad6cf40efaff64a89683a9459869e28ac9"} Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.246065 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e338edf26926f4ba8724c786baa1f3ad6cf40efaff64a89683a9459869e28ac9" Mar 13 
21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.245777 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.443499 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4"] Mar 13 21:02:50 crc kubenswrapper[4790]: E0313 21:02:50.443856 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe27e2d5-7108-4d49-99bb-15208f36cff7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.443872 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe27e2d5-7108-4d49-99bb-15208f36cff7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 13 21:02:50 crc kubenswrapper[4790]: E0313 21:02:50.443887 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b5219d2-3afd-4a8d-ab26-3102b6dee3b0" containerName="oc" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.443894 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b5219d2-3afd-4a8d-ab26-3102b6dee3b0" containerName="oc" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.444084 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b5219d2-3afd-4a8d-ab26-3102b6dee3b0" containerName="oc" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.444101 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe27e2d5-7108-4d49-99bb-15208f36cff7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.444759 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.447715 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.447738 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.447744 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.448312 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.449609 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.456635 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4"] Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.512693 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.546350 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.546467 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.546491 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.546525 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x9nr\" (UniqueName: \"kubernetes.io/projected/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-kube-api-access-2x9nr\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.546554 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.546574 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.648261 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.648496 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.648590 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.648622 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.648671 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x9nr\" (UniqueName: \"kubernetes.io/projected/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-kube-api-access-2x9nr\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.648713 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.652818 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.653003 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: 
\"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.655510 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.658310 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.659346 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.666578 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x9nr\" (UniqueName: \"kubernetes.io/projected/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-kube-api-access-2x9nr\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.823700 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:51 crc kubenswrapper[4790]: I0313 21:02:51.367594 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4"] Mar 13 21:02:52 crc kubenswrapper[4790]: I0313 21:02:52.266138 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" event={"ID":"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0","Type":"ContainerStarted","Data":"e12be54f325e5426cb892a229deff495e48abdb2e1c4e0dad6fe222c62342b3b"} Mar 13 21:02:52 crc kubenswrapper[4790]: I0313 21:02:52.266199 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" event={"ID":"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0","Type":"ContainerStarted","Data":"0dafa60c26fe92544b98d93c0ab98afa5193df785bac54fff3b546f1a811fd97"} Mar 13 21:02:52 crc kubenswrapper[4790]: I0313 21:02:52.314435 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" podStartSLOduration=1.857272823 podStartE2EDuration="2.314410208s" podCreationTimestamp="2026-03-13 21:02:50 +0000 UTC" firstStartedPulling="2026-03-13 21:02:51.374798044 +0000 UTC m=+2102.395913935" lastFinishedPulling="2026-03-13 21:02:51.831935429 +0000 UTC m=+2102.853051320" observedRunningTime="2026-03-13 21:02:52.307414376 +0000 UTC m=+2103.328530277" watchObservedRunningTime="2026-03-13 21:02:52.314410208 +0000 UTC m=+2103.335526099" Mar 13 21:02:56 crc kubenswrapper[4790]: I0313 21:02:56.828507 4790 scope.go:117] "RemoveContainer" 
containerID="089c34632a3aa85bf67d8f16facd625e77441bd26bee098a9592424a45b9e093" Mar 13 21:03:14 crc kubenswrapper[4790]: I0313 21:03:14.015959 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:03:14 crc kubenswrapper[4790]: I0313 21:03:14.016581 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:03:14 crc kubenswrapper[4790]: I0313 21:03:14.016650 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 21:03:14 crc kubenswrapper[4790]: I0313 21:03:14.017513 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a9a94b980a92050256811681ca21f1352e966795dd8d0d5b7f29e267e6b5c0a4"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 21:03:14 crc kubenswrapper[4790]: I0313 21:03:14.017574 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://a9a94b980a92050256811681ca21f1352e966795dd8d0d5b7f29e267e6b5c0a4" gracePeriod=600 Mar 13 21:03:15 crc kubenswrapper[4790]: I0313 21:03:15.016944 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="a9a94b980a92050256811681ca21f1352e966795dd8d0d5b7f29e267e6b5c0a4" exitCode=0 Mar 13 21:03:15 crc kubenswrapper[4790]: I0313 21:03:15.017009 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"a9a94b980a92050256811681ca21f1352e966795dd8d0d5b7f29e267e6b5c0a4"} Mar 13 21:03:15 crc kubenswrapper[4790]: I0313 21:03:15.018665 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5"} Mar 13 21:03:15 crc kubenswrapper[4790]: I0313 21:03:15.018719 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.208543 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2g4x4"] Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.211083 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.236646 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2g4x4"] Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.339245 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-catalog-content\") pod \"redhat-operators-2g4x4\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.339559 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-utilities\") pod \"redhat-operators-2g4x4\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.339824 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9qkb\" (UniqueName: \"kubernetes.io/projected/22852680-9cbb-4ca0-817d-d8391e019c99-kube-api-access-s9qkb\") pod \"redhat-operators-2g4x4\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.442012 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-utilities\") pod \"redhat-operators-2g4x4\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.442124 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s9qkb\" (UniqueName: \"kubernetes.io/projected/22852680-9cbb-4ca0-817d-d8391e019c99-kube-api-access-s9qkb\") pod \"redhat-operators-2g4x4\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.442236 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-catalog-content\") pod \"redhat-operators-2g4x4\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.442516 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-utilities\") pod \"redhat-operators-2g4x4\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.442655 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-catalog-content\") pod \"redhat-operators-2g4x4\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.476193 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9qkb\" (UniqueName: \"kubernetes.io/projected/22852680-9cbb-4ca0-817d-d8391e019c99-kube-api-access-s9qkb\") pod \"redhat-operators-2g4x4\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.587706 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:18 crc kubenswrapper[4790]: I0313 21:03:18.076401 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2g4x4"] Mar 13 21:03:19 crc kubenswrapper[4790]: I0313 21:03:19.053067 4790 generic.go:334] "Generic (PLEG): container finished" podID="22852680-9cbb-4ca0-817d-d8391e019c99" containerID="3f653f15de75632b9c4bc0a14dc99e71623fdee0ef1c06898871ea922e049c84" exitCode=0 Mar 13 21:03:19 crc kubenswrapper[4790]: I0313 21:03:19.053268 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g4x4" event={"ID":"22852680-9cbb-4ca0-817d-d8391e019c99","Type":"ContainerDied","Data":"3f653f15de75632b9c4bc0a14dc99e71623fdee0ef1c06898871ea922e049c84"} Mar 13 21:03:19 crc kubenswrapper[4790]: I0313 21:03:19.053610 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g4x4" event={"ID":"22852680-9cbb-4ca0-817d-d8391e019c99","Type":"ContainerStarted","Data":"ab4c725566fad1ec1f3ea2b58848dc912abb41b5c8e1dbefa92e0bd281a81a63"} Mar 13 21:03:20 crc kubenswrapper[4790]: I0313 21:03:20.063187 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g4x4" event={"ID":"22852680-9cbb-4ca0-817d-d8391e019c99","Type":"ContainerStarted","Data":"abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f"} Mar 13 21:03:21 crc kubenswrapper[4790]: I0313 21:03:21.074165 4790 generic.go:334] "Generic (PLEG): container finished" podID="22852680-9cbb-4ca0-817d-d8391e019c99" containerID="abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f" exitCode=0 Mar 13 21:03:21 crc kubenswrapper[4790]: I0313 21:03:21.074478 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g4x4" 
event={"ID":"22852680-9cbb-4ca0-817d-d8391e019c99","Type":"ContainerDied","Data":"abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f"} Mar 13 21:03:23 crc kubenswrapper[4790]: I0313 21:03:23.094691 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g4x4" event={"ID":"22852680-9cbb-4ca0-817d-d8391e019c99","Type":"ContainerStarted","Data":"5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56"} Mar 13 21:03:23 crc kubenswrapper[4790]: I0313 21:03:23.120112 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2g4x4" podStartSLOduration=3.328190068 podStartE2EDuration="6.12008583s" podCreationTimestamp="2026-03-13 21:03:17 +0000 UTC" firstStartedPulling="2026-03-13 21:03:19.054964755 +0000 UTC m=+2130.076080646" lastFinishedPulling="2026-03-13 21:03:21.846860507 +0000 UTC m=+2132.867976408" observedRunningTime="2026-03-13 21:03:23.11019667 +0000 UTC m=+2134.131312561" watchObservedRunningTime="2026-03-13 21:03:23.12008583 +0000 UTC m=+2134.141201721" Mar 13 21:03:27 crc kubenswrapper[4790]: I0313 21:03:27.588746 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:27 crc kubenswrapper[4790]: I0313 21:03:27.589095 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:28 crc kubenswrapper[4790]: I0313 21:03:28.646699 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2g4x4" podUID="22852680-9cbb-4ca0-817d-d8391e019c99" containerName="registry-server" probeResult="failure" output=< Mar 13 21:03:28 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Mar 13 21:03:28 crc kubenswrapper[4790]: > Mar 13 21:03:37 crc kubenswrapper[4790]: I0313 21:03:37.650402 4790 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:37 crc kubenswrapper[4790]: I0313 21:03:37.698665 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:37 crc kubenswrapper[4790]: I0313 21:03:37.891014 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2g4x4"] Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.247744 4790 generic.go:334] "Generic (PLEG): container finished" podID="944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0" containerID="e12be54f325e5426cb892a229deff495e48abdb2e1c4e0dad6fe222c62342b3b" exitCode=0 Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.247816 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" event={"ID":"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0","Type":"ContainerDied","Data":"e12be54f325e5426cb892a229deff495e48abdb2e1c4e0dad6fe222c62342b3b"} Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.248371 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2g4x4" podUID="22852680-9cbb-4ca0-817d-d8391e019c99" containerName="registry-server" containerID="cri-o://5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56" gracePeriod=2 Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.715091 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.843916 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-catalog-content\") pod \"22852680-9cbb-4ca0-817d-d8391e019c99\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.844168 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9qkb\" (UniqueName: \"kubernetes.io/projected/22852680-9cbb-4ca0-817d-d8391e019c99-kube-api-access-s9qkb\") pod \"22852680-9cbb-4ca0-817d-d8391e019c99\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.844861 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-utilities\") pod \"22852680-9cbb-4ca0-817d-d8391e019c99\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.845918 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-utilities" (OuterVolumeSpecName: "utilities") pod "22852680-9cbb-4ca0-817d-d8391e019c99" (UID: "22852680-9cbb-4ca0-817d-d8391e019c99"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.847340 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.854854 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22852680-9cbb-4ca0-817d-d8391e019c99-kube-api-access-s9qkb" (OuterVolumeSpecName: "kube-api-access-s9qkb") pod "22852680-9cbb-4ca0-817d-d8391e019c99" (UID: "22852680-9cbb-4ca0-817d-d8391e019c99"). InnerVolumeSpecName "kube-api-access-s9qkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.949978 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9qkb\" (UniqueName: \"kubernetes.io/projected/22852680-9cbb-4ca0-817d-d8391e019c99-kube-api-access-s9qkb\") on node \"crc\" DevicePath \"\"" Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.976780 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22852680-9cbb-4ca0-817d-d8391e019c99" (UID: "22852680-9cbb-4ca0-817d-d8391e019c99"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.051587 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.262297 4790 generic.go:334] "Generic (PLEG): container finished" podID="22852680-9cbb-4ca0-817d-d8391e019c99" containerID="5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56" exitCode=0 Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.262357 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.262400 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g4x4" event={"ID":"22852680-9cbb-4ca0-817d-d8391e019c99","Type":"ContainerDied","Data":"5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56"} Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.262704 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g4x4" event={"ID":"22852680-9cbb-4ca0-817d-d8391e019c99","Type":"ContainerDied","Data":"ab4c725566fad1ec1f3ea2b58848dc912abb41b5c8e1dbefa92e0bd281a81a63"} Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.262734 4790 scope.go:117] "RemoveContainer" containerID="5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.294338 4790 scope.go:117] "RemoveContainer" containerID="abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.307982 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2g4x4"] Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 
21:03:40.318876 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2g4x4"] Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.331426 4790 scope.go:117] "RemoveContainer" containerID="3f653f15de75632b9c4bc0a14dc99e71623fdee0ef1c06898871ea922e049c84" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.369526 4790 scope.go:117] "RemoveContainer" containerID="5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56" Mar 13 21:03:40 crc kubenswrapper[4790]: E0313 21:03:40.370286 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56\": container with ID starting with 5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56 not found: ID does not exist" containerID="5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.370346 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56"} err="failed to get container status \"5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56\": rpc error: code = NotFound desc = could not find container \"5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56\": container with ID starting with 5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56 not found: ID does not exist" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.370390 4790 scope.go:117] "RemoveContainer" containerID="abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f" Mar 13 21:03:40 crc kubenswrapper[4790]: E0313 21:03:40.371507 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f\": container with ID 
starting with abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f not found: ID does not exist" containerID="abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.371533 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f"} err="failed to get container status \"abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f\": rpc error: code = NotFound desc = could not find container \"abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f\": container with ID starting with abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f not found: ID does not exist" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.371551 4790 scope.go:117] "RemoveContainer" containerID="3f653f15de75632b9c4bc0a14dc99e71623fdee0ef1c06898871ea922e049c84" Mar 13 21:03:40 crc kubenswrapper[4790]: E0313 21:03:40.372056 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f653f15de75632b9c4bc0a14dc99e71623fdee0ef1c06898871ea922e049c84\": container with ID starting with 3f653f15de75632b9c4bc0a14dc99e71623fdee0ef1c06898871ea922e049c84 not found: ID does not exist" containerID="3f653f15de75632b9c4bc0a14dc99e71623fdee0ef1c06898871ea922e049c84" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.372082 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f653f15de75632b9c4bc0a14dc99e71623fdee0ef1c06898871ea922e049c84"} err="failed to get container status \"3f653f15de75632b9c4bc0a14dc99e71623fdee0ef1c06898871ea922e049c84\": rpc error: code = NotFound desc = could not find container \"3f653f15de75632b9c4bc0a14dc99e71623fdee0ef1c06898871ea922e049c84\": container with ID starting with 3f653f15de75632b9c4bc0a14dc99e71623fdee0ef1c06898871ea922e049c84 not found: 
ID does not exist" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.646741 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.765398 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-ssh-key-openstack-edpm-ipam\") pod \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.765504 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-inventory\") pod \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.765572 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.765641 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-nova-metadata-neutron-config-0\") pod \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.765690 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x9nr\" (UniqueName: 
\"kubernetes.io/projected/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-kube-api-access-2x9nr\") pod \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.765723 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-metadata-combined-ca-bundle\") pod \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.770145 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-kube-api-access-2x9nr" (OuterVolumeSpecName: "kube-api-access-2x9nr") pod "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0" (UID: "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0"). InnerVolumeSpecName "kube-api-access-2x9nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.771198 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0" (UID: "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.798848 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-inventory" (OuterVolumeSpecName: "inventory") pod "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0" (UID: "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.801723 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0" (UID: "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.808536 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0" (UID: "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.809694 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0" (UID: "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.869937 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.869976 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.869990 4790 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.870004 4790 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.870017 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x9nr\" (UniqueName: \"kubernetes.io/projected/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-kube-api-access-2x9nr\") on node \"crc\" DevicePath \"\"" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.870030 4790 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.272106 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" event={"ID":"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0","Type":"ContainerDied","Data":"0dafa60c26fe92544b98d93c0ab98afa5193df785bac54fff3b546f1a811fd97"} Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.273050 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dafa60c26fe92544b98d93c0ab98afa5193df785bac54fff3b546f1a811fd97" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.272119 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.356230 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx"] Mar 13 21:03:41 crc kubenswrapper[4790]: E0313 21:03:41.356730 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22852680-9cbb-4ca0-817d-d8391e019c99" containerName="extract-utilities" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.356751 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="22852680-9cbb-4ca0-817d-d8391e019c99" containerName="extract-utilities" Mar 13 21:03:41 crc kubenswrapper[4790]: E0313 21:03:41.356784 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22852680-9cbb-4ca0-817d-d8391e019c99" containerName="extract-content" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.356794 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="22852680-9cbb-4ca0-817d-d8391e019c99" containerName="extract-content" Mar 13 21:03:41 crc kubenswrapper[4790]: E0313 21:03:41.356817 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22852680-9cbb-4ca0-817d-d8391e019c99" containerName="registry-server" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.356826 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="22852680-9cbb-4ca0-817d-d8391e019c99" containerName="registry-server" Mar 13 21:03:41 crc kubenswrapper[4790]: E0313 21:03:41.356841 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.356850 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.357079 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.357106 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="22852680-9cbb-4ca0-817d-d8391e019c99" containerName="registry-server" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.357909 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.359817 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.359956 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.363624 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.363628 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.364085 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.365753 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx"] Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.479552 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.479627 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.479658 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.479686 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd6kl\" (UniqueName: \"kubernetes.io/projected/c70cf667-ebdd-414d-be40-62d26209abcf-kube-api-access-sd6kl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.479751 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.580858 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.580973 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.581025 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.581056 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.581091 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd6kl\" (UniqueName: \"kubernetes.io/projected/c70cf667-ebdd-414d-be40-62d26209abcf-kube-api-access-sd6kl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.584814 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: 
\"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.585273 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.586329 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.590972 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.597534 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd6kl\" (UniqueName: \"kubernetes.io/projected/c70cf667-ebdd-414d-be40-62d26209abcf-kube-api-access-sd6kl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.676474 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="22852680-9cbb-4ca0-817d-d8391e019c99" path="/var/lib/kubelet/pods/22852680-9cbb-4ca0-817d-d8391e019c99/volumes" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.681827 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:42 crc kubenswrapper[4790]: I0313 21:03:42.188185 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx"] Mar 13 21:03:42 crc kubenswrapper[4790]: I0313 21:03:42.283311 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" event={"ID":"c70cf667-ebdd-414d-be40-62d26209abcf","Type":"ContainerStarted","Data":"750819b8447adbdcf460745d4cc408a88dcc52443bc7524ebb6bbcda342e2ca3"} Mar 13 21:03:43 crc kubenswrapper[4790]: I0313 21:03:43.292863 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" event={"ID":"c70cf667-ebdd-414d-be40-62d26209abcf","Type":"ContainerStarted","Data":"2a4b7cacb6bb56397aa8e80bce91be52e687845b109945911184f15eb741cc40"} Mar 13 21:03:43 crc kubenswrapper[4790]: I0313 21:03:43.306790 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" podStartSLOduration=1.8644482 podStartE2EDuration="2.306766418s" podCreationTimestamp="2026-03-13 21:03:41 +0000 UTC" firstStartedPulling="2026-03-13 21:03:42.179675029 +0000 UTC m=+2153.200790920" lastFinishedPulling="2026-03-13 21:03:42.621993247 +0000 UTC m=+2153.643109138" observedRunningTime="2026-03-13 21:03:43.305620097 +0000 UTC m=+2154.326735988" watchObservedRunningTime="2026-03-13 21:03:43.306766418 +0000 UTC m=+2154.327882309" Mar 13 21:04:00 crc kubenswrapper[4790]: I0313 21:04:00.151661 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557264-b5j85"] 
Mar 13 21:04:00 crc kubenswrapper[4790]: I0313 21:04:00.153914 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557264-b5j85" Mar 13 21:04:00 crc kubenswrapper[4790]: I0313 21:04:00.157308 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:04:00 crc kubenswrapper[4790]: I0313 21:04:00.157360 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:04:00 crc kubenswrapper[4790]: I0313 21:04:00.157694 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:04:00 crc kubenswrapper[4790]: I0313 21:04:00.160361 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557264-b5j85"] Mar 13 21:04:00 crc kubenswrapper[4790]: I0313 21:04:00.305019 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69tg5\" (UniqueName: \"kubernetes.io/projected/33811d20-0fb8-4b06-a9dd-d2488b19d7b9-kube-api-access-69tg5\") pod \"auto-csr-approver-29557264-b5j85\" (UID: \"33811d20-0fb8-4b06-a9dd-d2488b19d7b9\") " pod="openshift-infra/auto-csr-approver-29557264-b5j85" Mar 13 21:04:00 crc kubenswrapper[4790]: I0313 21:04:00.406878 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69tg5\" (UniqueName: \"kubernetes.io/projected/33811d20-0fb8-4b06-a9dd-d2488b19d7b9-kube-api-access-69tg5\") pod \"auto-csr-approver-29557264-b5j85\" (UID: \"33811d20-0fb8-4b06-a9dd-d2488b19d7b9\") " pod="openshift-infra/auto-csr-approver-29557264-b5j85" Mar 13 21:04:00 crc kubenswrapper[4790]: I0313 21:04:00.430990 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69tg5\" (UniqueName: 
\"kubernetes.io/projected/33811d20-0fb8-4b06-a9dd-d2488b19d7b9-kube-api-access-69tg5\") pod \"auto-csr-approver-29557264-b5j85\" (UID: \"33811d20-0fb8-4b06-a9dd-d2488b19d7b9\") " pod="openshift-infra/auto-csr-approver-29557264-b5j85" Mar 13 21:04:00 crc kubenswrapper[4790]: I0313 21:04:00.499066 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557264-b5j85" Mar 13 21:04:00 crc kubenswrapper[4790]: I0313 21:04:00.919041 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557264-b5j85"] Mar 13 21:04:00 crc kubenswrapper[4790]: W0313 21:04:00.921658 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33811d20_0fb8_4b06_a9dd_d2488b19d7b9.slice/crio-5bbe699a2d0f4ccd103ed5fc014ec0d87203e77ab981483e5c6e0a7700061b06 WatchSource:0}: Error finding container 5bbe699a2d0f4ccd103ed5fc014ec0d87203e77ab981483e5c6e0a7700061b06: Status 404 returned error can't find the container with id 5bbe699a2d0f4ccd103ed5fc014ec0d87203e77ab981483e5c6e0a7700061b06 Mar 13 21:04:01 crc kubenswrapper[4790]: I0313 21:04:01.454358 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557264-b5j85" event={"ID":"33811d20-0fb8-4b06-a9dd-d2488b19d7b9","Type":"ContainerStarted","Data":"5bbe699a2d0f4ccd103ed5fc014ec0d87203e77ab981483e5c6e0a7700061b06"} Mar 13 21:04:02 crc kubenswrapper[4790]: I0313 21:04:02.465321 4790 generic.go:334] "Generic (PLEG): container finished" podID="33811d20-0fb8-4b06-a9dd-d2488b19d7b9" containerID="3a443bd9f4b8d1df7af93baf309b6b85a45139407ed6e8e7a9df32fd174d2a54" exitCode=0 Mar 13 21:04:02 crc kubenswrapper[4790]: I0313 21:04:02.465411 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557264-b5j85" 
event={"ID":"33811d20-0fb8-4b06-a9dd-d2488b19d7b9","Type":"ContainerDied","Data":"3a443bd9f4b8d1df7af93baf309b6b85a45139407ed6e8e7a9df32fd174d2a54"} Mar 13 21:04:03 crc kubenswrapper[4790]: I0313 21:04:03.761991 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557264-b5j85" Mar 13 21:04:03 crc kubenswrapper[4790]: I0313 21:04:03.778716 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69tg5\" (UniqueName: \"kubernetes.io/projected/33811d20-0fb8-4b06-a9dd-d2488b19d7b9-kube-api-access-69tg5\") pod \"33811d20-0fb8-4b06-a9dd-d2488b19d7b9\" (UID: \"33811d20-0fb8-4b06-a9dd-d2488b19d7b9\") " Mar 13 21:04:03 crc kubenswrapper[4790]: I0313 21:04:03.783993 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33811d20-0fb8-4b06-a9dd-d2488b19d7b9-kube-api-access-69tg5" (OuterVolumeSpecName: "kube-api-access-69tg5") pod "33811d20-0fb8-4b06-a9dd-d2488b19d7b9" (UID: "33811d20-0fb8-4b06-a9dd-d2488b19d7b9"). InnerVolumeSpecName "kube-api-access-69tg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:04:03 crc kubenswrapper[4790]: I0313 21:04:03.881365 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69tg5\" (UniqueName: \"kubernetes.io/projected/33811d20-0fb8-4b06-a9dd-d2488b19d7b9-kube-api-access-69tg5\") on node \"crc\" DevicePath \"\"" Mar 13 21:04:04 crc kubenswrapper[4790]: I0313 21:04:04.488579 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557264-b5j85" event={"ID":"33811d20-0fb8-4b06-a9dd-d2488b19d7b9","Type":"ContainerDied","Data":"5bbe699a2d0f4ccd103ed5fc014ec0d87203e77ab981483e5c6e0a7700061b06"} Mar 13 21:04:04 crc kubenswrapper[4790]: I0313 21:04:04.488624 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bbe699a2d0f4ccd103ed5fc014ec0d87203e77ab981483e5c6e0a7700061b06" Mar 13 21:04:04 crc kubenswrapper[4790]: I0313 21:04:04.488654 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557264-b5j85" Mar 13 21:04:04 crc kubenswrapper[4790]: I0313 21:04:04.827650 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557258-gqmrr"] Mar 13 21:04:04 crc kubenswrapper[4790]: I0313 21:04:04.837331 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557258-gqmrr"] Mar 13 21:04:05 crc kubenswrapper[4790]: I0313 21:04:05.672390 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7706813b-e8e7-4b17-ba18-993c121eed66" path="/var/lib/kubelet/pods/7706813b-e8e7-4b17-ba18-993c121eed66/volumes" Mar 13 21:04:56 crc kubenswrapper[4790]: I0313 21:04:56.942319 4790 scope.go:117] "RemoveContainer" containerID="98d6a341587e40eeb366a4b8a2eab51c3ea58fa67b5db767f9e2261febd34d64" Mar 13 21:05:02 crc kubenswrapper[4790]: I0313 21:05:02.997837 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-88dvw"] Mar 13 21:05:02 crc kubenswrapper[4790]: E0313 21:05:02.998903 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33811d20-0fb8-4b06-a9dd-d2488b19d7b9" containerName="oc" Mar 13 21:05:02 crc kubenswrapper[4790]: I0313 21:05:02.998921 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="33811d20-0fb8-4b06-a9dd-d2488b19d7b9" containerName="oc" Mar 13 21:05:02 crc kubenswrapper[4790]: I0313 21:05:02.999149 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="33811d20-0fb8-4b06-a9dd-d2488b19d7b9" containerName="oc" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.001033 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.009345 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-88dvw"] Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.170176 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-catalog-content\") pod \"redhat-marketplace-88dvw\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.170632 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-utilities\") pod \"redhat-marketplace-88dvw\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.170687 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpzc7\" (UniqueName: 
\"kubernetes.io/projected/de983e6c-4ce2-42f6-94ed-44a141b2b39d-kube-api-access-mpzc7\") pod \"redhat-marketplace-88dvw\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.272056 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-utilities\") pod \"redhat-marketplace-88dvw\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.272099 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpzc7\" (UniqueName: \"kubernetes.io/projected/de983e6c-4ce2-42f6-94ed-44a141b2b39d-kube-api-access-mpzc7\") pod \"redhat-marketplace-88dvw\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.272204 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-catalog-content\") pod \"redhat-marketplace-88dvw\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.272911 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-catalog-content\") pod \"redhat-marketplace-88dvw\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.272906 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-utilities\") pod \"redhat-marketplace-88dvw\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.294585 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpzc7\" (UniqueName: \"kubernetes.io/projected/de983e6c-4ce2-42f6-94ed-44a141b2b39d-kube-api-access-mpzc7\") pod \"redhat-marketplace-88dvw\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.326467 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.787876 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-88dvw"] Mar 13 21:05:04 crc kubenswrapper[4790]: I0313 21:05:04.055805 4790 generic.go:334] "Generic (PLEG): container finished" podID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" containerID="538f094cbd74486037f753c3611730ee28578cdb59d56d5be516e368c14126b1" exitCode=0 Mar 13 21:05:04 crc kubenswrapper[4790]: I0313 21:05:04.055865 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88dvw" event={"ID":"de983e6c-4ce2-42f6-94ed-44a141b2b39d","Type":"ContainerDied","Data":"538f094cbd74486037f753c3611730ee28578cdb59d56d5be516e368c14126b1"} Mar 13 21:05:04 crc kubenswrapper[4790]: I0313 21:05:04.055910 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88dvw" event={"ID":"de983e6c-4ce2-42f6-94ed-44a141b2b39d","Type":"ContainerStarted","Data":"f00d7030b9f77eb48a44e61af48736a85e89f6cd7470963d43883183ef36faa7"} Mar 13 21:05:04 crc kubenswrapper[4790]: I0313 21:05:04.057419 4790 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Mar 13 21:05:05 crc kubenswrapper[4790]: I0313 21:05:05.067309 4790 generic.go:334] "Generic (PLEG): container finished" podID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" containerID="89d396ea2099b01727d049fa95d312446f917d6fc53c54aa76e4c03f934f2cad" exitCode=0 Mar 13 21:05:05 crc kubenswrapper[4790]: I0313 21:05:05.067391 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88dvw" event={"ID":"de983e6c-4ce2-42f6-94ed-44a141b2b39d","Type":"ContainerDied","Data":"89d396ea2099b01727d049fa95d312446f917d6fc53c54aa76e4c03f934f2cad"} Mar 13 21:05:06 crc kubenswrapper[4790]: I0313 21:05:06.080272 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88dvw" event={"ID":"de983e6c-4ce2-42f6-94ed-44a141b2b39d","Type":"ContainerStarted","Data":"10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35"} Mar 13 21:05:06 crc kubenswrapper[4790]: I0313 21:05:06.101480 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-88dvw" podStartSLOduration=2.683805465 podStartE2EDuration="4.10146327s" podCreationTimestamp="2026-03-13 21:05:02 +0000 UTC" firstStartedPulling="2026-03-13 21:05:04.057181521 +0000 UTC m=+2235.078297412" lastFinishedPulling="2026-03-13 21:05:05.474839326 +0000 UTC m=+2236.495955217" observedRunningTime="2026-03-13 21:05:06.096891085 +0000 UTC m=+2237.118006986" watchObservedRunningTime="2026-03-13 21:05:06.10146327 +0000 UTC m=+2237.122579161" Mar 13 21:05:13 crc kubenswrapper[4790]: I0313 21:05:13.326729 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:13 crc kubenswrapper[4790]: I0313 21:05:13.327288 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:13 crc kubenswrapper[4790]: 
I0313 21:05:13.381438 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:14 crc kubenswrapper[4790]: I0313 21:05:14.015338 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:05:14 crc kubenswrapper[4790]: I0313 21:05:14.015413 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:05:14 crc kubenswrapper[4790]: I0313 21:05:14.213021 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:14 crc kubenswrapper[4790]: I0313 21:05:14.273562 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-88dvw"] Mar 13 21:05:16 crc kubenswrapper[4790]: I0313 21:05:16.192610 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-88dvw" podUID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" containerName="registry-server" containerID="cri-o://10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35" gracePeriod=2 Mar 13 21:05:16 crc kubenswrapper[4790]: I0313 21:05:16.646766 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:16 crc kubenswrapper[4790]: I0313 21:05:16.821207 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-utilities\") pod \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " Mar 13 21:05:16 crc kubenswrapper[4790]: I0313 21:05:16.821365 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-catalog-content\") pod \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " Mar 13 21:05:16 crc kubenswrapper[4790]: I0313 21:05:16.821565 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpzc7\" (UniqueName: \"kubernetes.io/projected/de983e6c-4ce2-42f6-94ed-44a141b2b39d-kube-api-access-mpzc7\") pod \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " Mar 13 21:05:16 crc kubenswrapper[4790]: I0313 21:05:16.821903 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-utilities" (OuterVolumeSpecName: "utilities") pod "de983e6c-4ce2-42f6-94ed-44a141b2b39d" (UID: "de983e6c-4ce2-42f6-94ed-44a141b2b39d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:05:16 crc kubenswrapper[4790]: I0313 21:05:16.822128 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:05:16 crc kubenswrapper[4790]: I0313 21:05:16.827458 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de983e6c-4ce2-42f6-94ed-44a141b2b39d-kube-api-access-mpzc7" (OuterVolumeSpecName: "kube-api-access-mpzc7") pod "de983e6c-4ce2-42f6-94ed-44a141b2b39d" (UID: "de983e6c-4ce2-42f6-94ed-44a141b2b39d"). InnerVolumeSpecName "kube-api-access-mpzc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:05:16 crc kubenswrapper[4790]: I0313 21:05:16.845649 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de983e6c-4ce2-42f6-94ed-44a141b2b39d" (UID: "de983e6c-4ce2-42f6-94ed-44a141b2b39d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:05:16 crc kubenswrapper[4790]: I0313 21:05:16.924281 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpzc7\" (UniqueName: \"kubernetes.io/projected/de983e6c-4ce2-42f6-94ed-44a141b2b39d-kube-api-access-mpzc7\") on node \"crc\" DevicePath \"\"" Mar 13 21:05:16 crc kubenswrapper[4790]: I0313 21:05:16.924319 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.204180 4790 generic.go:334] "Generic (PLEG): container finished" podID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" containerID="10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35" exitCode=0 Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.204251 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88dvw" event={"ID":"de983e6c-4ce2-42f6-94ed-44a141b2b39d","Type":"ContainerDied","Data":"10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35"} Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.204315 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88dvw" event={"ID":"de983e6c-4ce2-42f6-94ed-44a141b2b39d","Type":"ContainerDied","Data":"f00d7030b9f77eb48a44e61af48736a85e89f6cd7470963d43883183ef36faa7"} Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.204318 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.204353 4790 scope.go:117] "RemoveContainer" containerID="10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35" Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.244791 4790 scope.go:117] "RemoveContainer" containerID="89d396ea2099b01727d049fa95d312446f917d6fc53c54aa76e4c03f934f2cad" Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.251930 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-88dvw"] Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.261409 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-88dvw"] Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.278675 4790 scope.go:117] "RemoveContainer" containerID="538f094cbd74486037f753c3611730ee28578cdb59d56d5be516e368c14126b1" Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.315935 4790 scope.go:117] "RemoveContainer" containerID="10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35" Mar 13 21:05:17 crc kubenswrapper[4790]: E0313 21:05:17.316650 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35\": container with ID starting with 10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35 not found: ID does not exist" containerID="10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35" Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.316684 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35"} err="failed to get container status \"10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35\": rpc error: code = NotFound desc = could not find container 
\"10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35\": container with ID starting with 10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35 not found: ID does not exist" Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.316709 4790 scope.go:117] "RemoveContainer" containerID="89d396ea2099b01727d049fa95d312446f917d6fc53c54aa76e4c03f934f2cad" Mar 13 21:05:17 crc kubenswrapper[4790]: E0313 21:05:17.316965 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89d396ea2099b01727d049fa95d312446f917d6fc53c54aa76e4c03f934f2cad\": container with ID starting with 89d396ea2099b01727d049fa95d312446f917d6fc53c54aa76e4c03f934f2cad not found: ID does not exist" containerID="89d396ea2099b01727d049fa95d312446f917d6fc53c54aa76e4c03f934f2cad" Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.316995 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89d396ea2099b01727d049fa95d312446f917d6fc53c54aa76e4c03f934f2cad"} err="failed to get container status \"89d396ea2099b01727d049fa95d312446f917d6fc53c54aa76e4c03f934f2cad\": rpc error: code = NotFound desc = could not find container \"89d396ea2099b01727d049fa95d312446f917d6fc53c54aa76e4c03f934f2cad\": container with ID starting with 89d396ea2099b01727d049fa95d312446f917d6fc53c54aa76e4c03f934f2cad not found: ID does not exist" Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.317010 4790 scope.go:117] "RemoveContainer" containerID="538f094cbd74486037f753c3611730ee28578cdb59d56d5be516e368c14126b1" Mar 13 21:05:17 crc kubenswrapper[4790]: E0313 21:05:17.317436 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"538f094cbd74486037f753c3611730ee28578cdb59d56d5be516e368c14126b1\": container with ID starting with 538f094cbd74486037f753c3611730ee28578cdb59d56d5be516e368c14126b1 not found: ID does not exist" 
containerID="538f094cbd74486037f753c3611730ee28578cdb59d56d5be516e368c14126b1" Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.317464 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"538f094cbd74486037f753c3611730ee28578cdb59d56d5be516e368c14126b1"} err="failed to get container status \"538f094cbd74486037f753c3611730ee28578cdb59d56d5be516e368c14126b1\": rpc error: code = NotFound desc = could not find container \"538f094cbd74486037f753c3611730ee28578cdb59d56d5be516e368c14126b1\": container with ID starting with 538f094cbd74486037f753c3611730ee28578cdb59d56d5be516e368c14126b1 not found: ID does not exist" Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.670164 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" path="/var/lib/kubelet/pods/de983e6c-4ce2-42f6-94ed-44a141b2b39d/volumes" Mar 13 21:05:44 crc kubenswrapper[4790]: I0313 21:05:44.016418 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:05:44 crc kubenswrapper[4790]: I0313 21:05:44.018576 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.148582 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557266-4sbr6"] Mar 13 21:06:00 crc kubenswrapper[4790]: E0313 21:06:00.149439 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" containerName="extract-utilities" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.149450 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" containerName="extract-utilities" Mar 13 21:06:00 crc kubenswrapper[4790]: E0313 21:06:00.149464 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" containerName="extract-content" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.149470 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" containerName="extract-content" Mar 13 21:06:00 crc kubenswrapper[4790]: E0313 21:06:00.149493 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" containerName="registry-server" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.149499 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" containerName="registry-server" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.149731 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" containerName="registry-server" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.150364 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557266-4sbr6" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.153037 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.153172 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.153275 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.158693 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557266-4sbr6"] Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.336804 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgtql\" (UniqueName: \"kubernetes.io/projected/6a921d70-847d-4a96-ad9a-18438299237e-kube-api-access-dgtql\") pod \"auto-csr-approver-29557266-4sbr6\" (UID: \"6a921d70-847d-4a96-ad9a-18438299237e\") " pod="openshift-infra/auto-csr-approver-29557266-4sbr6" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.438614 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgtql\" (UniqueName: \"kubernetes.io/projected/6a921d70-847d-4a96-ad9a-18438299237e-kube-api-access-dgtql\") pod \"auto-csr-approver-29557266-4sbr6\" (UID: \"6a921d70-847d-4a96-ad9a-18438299237e\") " pod="openshift-infra/auto-csr-approver-29557266-4sbr6" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.462336 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgtql\" (UniqueName: \"kubernetes.io/projected/6a921d70-847d-4a96-ad9a-18438299237e-kube-api-access-dgtql\") pod \"auto-csr-approver-29557266-4sbr6\" (UID: \"6a921d70-847d-4a96-ad9a-18438299237e\") " 
pod="openshift-infra/auto-csr-approver-29557266-4sbr6" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.508752 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557266-4sbr6" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.939093 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557266-4sbr6"] Mar 13 21:06:01 crc kubenswrapper[4790]: I0313 21:06:01.646448 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557266-4sbr6" event={"ID":"6a921d70-847d-4a96-ad9a-18438299237e","Type":"ContainerStarted","Data":"859b08a740a73b3b98ee9d8f5c5d1673cf407fb993f939365f89333530343cb2"} Mar 13 21:06:02 crc kubenswrapper[4790]: I0313 21:06:02.657634 4790 generic.go:334] "Generic (PLEG): container finished" podID="6a921d70-847d-4a96-ad9a-18438299237e" containerID="3e0c0f63bb37da5c2b233a3b4a5d7ae121b4ed58aa4773dd2ed0d98e00fff307" exitCode=0 Mar 13 21:06:02 crc kubenswrapper[4790]: I0313 21:06:02.657680 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557266-4sbr6" event={"ID":"6a921d70-847d-4a96-ad9a-18438299237e","Type":"ContainerDied","Data":"3e0c0f63bb37da5c2b233a3b4a5d7ae121b4ed58aa4773dd2ed0d98e00fff307"} Mar 13 21:06:04 crc kubenswrapper[4790]: I0313 21:06:04.001836 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557266-4sbr6" Mar 13 21:06:04 crc kubenswrapper[4790]: I0313 21:06:04.039633 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgtql\" (UniqueName: \"kubernetes.io/projected/6a921d70-847d-4a96-ad9a-18438299237e-kube-api-access-dgtql\") pod \"6a921d70-847d-4a96-ad9a-18438299237e\" (UID: \"6a921d70-847d-4a96-ad9a-18438299237e\") " Mar 13 21:06:04 crc kubenswrapper[4790]: I0313 21:06:04.045822 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a921d70-847d-4a96-ad9a-18438299237e-kube-api-access-dgtql" (OuterVolumeSpecName: "kube-api-access-dgtql") pod "6a921d70-847d-4a96-ad9a-18438299237e" (UID: "6a921d70-847d-4a96-ad9a-18438299237e"). InnerVolumeSpecName "kube-api-access-dgtql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:06:04 crc kubenswrapper[4790]: I0313 21:06:04.143080 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgtql\" (UniqueName: \"kubernetes.io/projected/6a921d70-847d-4a96-ad9a-18438299237e-kube-api-access-dgtql\") on node \"crc\" DevicePath \"\"" Mar 13 21:06:04 crc kubenswrapper[4790]: I0313 21:06:04.679419 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557266-4sbr6" event={"ID":"6a921d70-847d-4a96-ad9a-18438299237e","Type":"ContainerDied","Data":"859b08a740a73b3b98ee9d8f5c5d1673cf407fb993f939365f89333530343cb2"} Mar 13 21:06:04 crc kubenswrapper[4790]: I0313 21:06:04.679461 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="859b08a740a73b3b98ee9d8f5c5d1673cf407fb993f939365f89333530343cb2" Mar 13 21:06:04 crc kubenswrapper[4790]: I0313 21:06:04.679525 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557266-4sbr6" Mar 13 21:06:05 crc kubenswrapper[4790]: I0313 21:06:05.074829 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557260-m6wtk"] Mar 13 21:06:05 crc kubenswrapper[4790]: I0313 21:06:05.098692 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557260-m6wtk"] Mar 13 21:06:05 crc kubenswrapper[4790]: I0313 21:06:05.670814 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0" path="/var/lib/kubelet/pods/6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0/volumes" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.729674 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q7hwz"] Mar 13 21:06:08 crc kubenswrapper[4790]: E0313 21:06:08.730329 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a921d70-847d-4a96-ad9a-18438299237e" containerName="oc" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.730342 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a921d70-847d-4a96-ad9a-18438299237e" containerName="oc" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.730548 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a921d70-847d-4a96-ad9a-18438299237e" containerName="oc" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.731978 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.746790 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q7hwz"] Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.831818 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-catalog-content\") pod \"certified-operators-q7hwz\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.832279 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwqpd\" (UniqueName: \"kubernetes.io/projected/07a0994d-f139-49d8-8d95-2b6ca52a0b84-kube-api-access-xwqpd\") pod \"certified-operators-q7hwz\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.832437 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-utilities\") pod \"certified-operators-q7hwz\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.934957 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-catalog-content\") pod \"certified-operators-q7hwz\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.935074 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xwqpd\" (UniqueName: \"kubernetes.io/projected/07a0994d-f139-49d8-8d95-2b6ca52a0b84-kube-api-access-xwqpd\") pod \"certified-operators-q7hwz\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.935133 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-utilities\") pod \"certified-operators-q7hwz\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.935528 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-catalog-content\") pod \"certified-operators-q7hwz\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.935623 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-utilities\") pod \"certified-operators-q7hwz\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.967660 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwqpd\" (UniqueName: \"kubernetes.io/projected/07a0994d-f139-49d8-8d95-2b6ca52a0b84-kube-api-access-xwqpd\") pod \"certified-operators-q7hwz\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:09 crc kubenswrapper[4790]: I0313 21:06:09.053965 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:09 crc kubenswrapper[4790]: I0313 21:06:09.605917 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q7hwz"] Mar 13 21:06:09 crc kubenswrapper[4790]: I0313 21:06:09.744781 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7hwz" event={"ID":"07a0994d-f139-49d8-8d95-2b6ca52a0b84","Type":"ContainerStarted","Data":"b899db2cbc3a23de289484d34e35002cd8a84c89f220f29fe04a8a0a11619bb5"} Mar 13 21:06:10 crc kubenswrapper[4790]: I0313 21:06:10.755279 4790 generic.go:334] "Generic (PLEG): container finished" podID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" containerID="6439279eb663ad1eb5f422402402c63aaac0d91e617550278ad0ce17daea07a0" exitCode=0 Mar 13 21:06:10 crc kubenswrapper[4790]: I0313 21:06:10.755330 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7hwz" event={"ID":"07a0994d-f139-49d8-8d95-2b6ca52a0b84","Type":"ContainerDied","Data":"6439279eb663ad1eb5f422402402c63aaac0d91e617550278ad0ce17daea07a0"} Mar 13 21:06:11 crc kubenswrapper[4790]: I0313 21:06:11.767767 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7hwz" event={"ID":"07a0994d-f139-49d8-8d95-2b6ca52a0b84","Type":"ContainerStarted","Data":"a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a"} Mar 13 21:06:12 crc kubenswrapper[4790]: I0313 21:06:12.779833 4790 generic.go:334] "Generic (PLEG): container finished" podID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" containerID="a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a" exitCode=0 Mar 13 21:06:12 crc kubenswrapper[4790]: I0313 21:06:12.779924 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7hwz" 
event={"ID":"07a0994d-f139-49d8-8d95-2b6ca52a0b84","Type":"ContainerDied","Data":"a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a"} Mar 13 21:06:13 crc kubenswrapper[4790]: I0313 21:06:13.792835 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7hwz" event={"ID":"07a0994d-f139-49d8-8d95-2b6ca52a0b84","Type":"ContainerStarted","Data":"5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6"} Mar 13 21:06:13 crc kubenswrapper[4790]: I0313 21:06:13.823502 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q7hwz" podStartSLOduration=3.375850526 podStartE2EDuration="5.823485297s" podCreationTimestamp="2026-03-13 21:06:08 +0000 UTC" firstStartedPulling="2026-03-13 21:06:10.756951744 +0000 UTC m=+2301.778067635" lastFinishedPulling="2026-03-13 21:06:13.204586515 +0000 UTC m=+2304.225702406" observedRunningTime="2026-03-13 21:06:13.81015065 +0000 UTC m=+2304.831266541" watchObservedRunningTime="2026-03-13 21:06:13.823485297 +0000 UTC m=+2304.844601188" Mar 13 21:06:14 crc kubenswrapper[4790]: I0313 21:06:14.016076 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:06:14 crc kubenswrapper[4790]: I0313 21:06:14.016167 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:06:14 crc kubenswrapper[4790]: I0313 21:06:14.016232 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 21:06:14 crc kubenswrapper[4790]: I0313 21:06:14.017368 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 21:06:14 crc kubenswrapper[4790]: I0313 21:06:14.017511 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" gracePeriod=600 Mar 13 21:06:14 crc kubenswrapper[4790]: E0313 21:06:14.153100 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:06:14 crc kubenswrapper[4790]: I0313 21:06:14.805470 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" exitCode=0 Mar 13 21:06:14 crc kubenswrapper[4790]: I0313 21:06:14.805520 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5"} Mar 13 21:06:14 crc 
kubenswrapper[4790]: I0313 21:06:14.805869 4790 scope.go:117] "RemoveContainer" containerID="a9a94b980a92050256811681ca21f1352e966795dd8d0d5b7f29e267e6b5c0a4" Mar 13 21:06:14 crc kubenswrapper[4790]: I0313 21:06:14.806548 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:06:14 crc kubenswrapper[4790]: E0313 21:06:14.806802 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:06:19 crc kubenswrapper[4790]: I0313 21:06:19.055111 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:19 crc kubenswrapper[4790]: I0313 21:06:19.056712 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:19 crc kubenswrapper[4790]: I0313 21:06:19.099776 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:19 crc kubenswrapper[4790]: I0313 21:06:19.890104 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:19 crc kubenswrapper[4790]: I0313 21:06:19.943015 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q7hwz"] Mar 13 21:06:21 crc kubenswrapper[4790]: I0313 21:06:21.869977 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q7hwz" 
podUID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" containerName="registry-server" containerID="cri-o://5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6" gracePeriod=2 Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.335029 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.490248 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwqpd\" (UniqueName: \"kubernetes.io/projected/07a0994d-f139-49d8-8d95-2b6ca52a0b84-kube-api-access-xwqpd\") pod \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.490356 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-utilities\") pod \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.490509 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-catalog-content\") pod \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.491321 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-utilities" (OuterVolumeSpecName: "utilities") pod "07a0994d-f139-49d8-8d95-2b6ca52a0b84" (UID: "07a0994d-f139-49d8-8d95-2b6ca52a0b84"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.496815 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07a0994d-f139-49d8-8d95-2b6ca52a0b84-kube-api-access-xwqpd" (OuterVolumeSpecName: "kube-api-access-xwqpd") pod "07a0994d-f139-49d8-8d95-2b6ca52a0b84" (UID: "07a0994d-f139-49d8-8d95-2b6ca52a0b84"). InnerVolumeSpecName "kube-api-access-xwqpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.592652 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.592915 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwqpd\" (UniqueName: \"kubernetes.io/projected/07a0994d-f139-49d8-8d95-2b6ca52a0b84-kube-api-access-xwqpd\") on node \"crc\" DevicePath \"\"" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.880085 4790 generic.go:334] "Generic (PLEG): container finished" podID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" containerID="5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6" exitCode=0 Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.880158 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.881360 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7hwz" event={"ID":"07a0994d-f139-49d8-8d95-2b6ca52a0b84","Type":"ContainerDied","Data":"5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6"} Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.881629 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7hwz" event={"ID":"07a0994d-f139-49d8-8d95-2b6ca52a0b84","Type":"ContainerDied","Data":"b899db2cbc3a23de289484d34e35002cd8a84c89f220f29fe04a8a0a11619bb5"} Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.881690 4790 scope.go:117] "RemoveContainer" containerID="5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.900599 4790 scope.go:117] "RemoveContainer" containerID="a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.920627 4790 scope.go:117] "RemoveContainer" containerID="6439279eb663ad1eb5f422402402c63aaac0d91e617550278ad0ce17daea07a0" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.961201 4790 scope.go:117] "RemoveContainer" containerID="5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6" Mar 13 21:06:22 crc kubenswrapper[4790]: E0313 21:06:22.961678 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6\": container with ID starting with 5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6 not found: ID does not exist" containerID="5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.961710 4790 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6"} err="failed to get container status \"5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6\": rpc error: code = NotFound desc = could not find container \"5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6\": container with ID starting with 5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6 not found: ID does not exist" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.961730 4790 scope.go:117] "RemoveContainer" containerID="a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a" Mar 13 21:06:22 crc kubenswrapper[4790]: E0313 21:06:22.962077 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a\": container with ID starting with a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a not found: ID does not exist" containerID="a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.962101 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a"} err="failed to get container status \"a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a\": rpc error: code = NotFound desc = could not find container \"a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a\": container with ID starting with a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a not found: ID does not exist" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.962116 4790 scope.go:117] "RemoveContainer" containerID="6439279eb663ad1eb5f422402402c63aaac0d91e617550278ad0ce17daea07a0" Mar 13 21:06:22 crc kubenswrapper[4790]: E0313 21:06:22.962545 4790 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6439279eb663ad1eb5f422402402c63aaac0d91e617550278ad0ce17daea07a0\": container with ID starting with 6439279eb663ad1eb5f422402402c63aaac0d91e617550278ad0ce17daea07a0 not found: ID does not exist" containerID="6439279eb663ad1eb5f422402402c63aaac0d91e617550278ad0ce17daea07a0" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.962578 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6439279eb663ad1eb5f422402402c63aaac0d91e617550278ad0ce17daea07a0"} err="failed to get container status \"6439279eb663ad1eb5f422402402c63aaac0d91e617550278ad0ce17daea07a0\": rpc error: code = NotFound desc = could not find container \"6439279eb663ad1eb5f422402402c63aaac0d91e617550278ad0ce17daea07a0\": container with ID starting with 6439279eb663ad1eb5f422402402c63aaac0d91e617550278ad0ce17daea07a0 not found: ID does not exist" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.964734 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "07a0994d-f139-49d8-8d95-2b6ca52a0b84" (UID: "07a0994d-f139-49d8-8d95-2b6ca52a0b84"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:06:23 crc kubenswrapper[4790]: I0313 21:06:23.000311 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:06:23 crc kubenswrapper[4790]: I0313 21:06:23.235634 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q7hwz"] Mar 13 21:06:23 crc kubenswrapper[4790]: I0313 21:06:23.243895 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q7hwz"] Mar 13 21:06:23 crc kubenswrapper[4790]: I0313 21:06:23.671108 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" path="/var/lib/kubelet/pods/07a0994d-f139-49d8-8d95-2b6ca52a0b84/volumes" Mar 13 21:06:25 crc kubenswrapper[4790]: I0313 21:06:25.660782 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:06:25 crc kubenswrapper[4790]: E0313 21:06:25.661631 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:06:38 crc kubenswrapper[4790]: I0313 21:06:38.659908 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:06:38 crc kubenswrapper[4790]: E0313 21:06:38.660582 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:06:50 crc kubenswrapper[4790]: I0313 21:06:50.660579 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:06:50 crc kubenswrapper[4790]: E0313 21:06:50.661304 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:06:57 crc kubenswrapper[4790]: I0313 21:06:57.057634 4790 scope.go:117] "RemoveContainer" containerID="5bdb5fc52fc30f3ba02b7731679748560a6cefd0cfb24c581e1cc818e8a93cb1" Mar 13 21:07:04 crc kubenswrapper[4790]: I0313 21:07:04.660451 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:07:04 crc kubenswrapper[4790]: E0313 21:07:04.661370 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:07:16 crc kubenswrapper[4790]: I0313 21:07:16.660163 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:07:16 crc 
kubenswrapper[4790]: E0313 21:07:16.661059 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:07:24 crc kubenswrapper[4790]: I0313 21:07:24.413925 4790 generic.go:334] "Generic (PLEG): container finished" podID="c70cf667-ebdd-414d-be40-62d26209abcf" containerID="2a4b7cacb6bb56397aa8e80bce91be52e687845b109945911184f15eb741cc40" exitCode=0 Mar 13 21:07:24 crc kubenswrapper[4790]: I0313 21:07:24.413990 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" event={"ID":"c70cf667-ebdd-414d-be40-62d26209abcf","Type":"ContainerDied","Data":"2a4b7cacb6bb56397aa8e80bce91be52e687845b109945911184f15eb741cc40"} Mar 13 21:07:25 crc kubenswrapper[4790]: I0313 21:07:25.882615 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:07:25 crc kubenswrapper[4790]: I0313 21:07:25.915318 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd6kl\" (UniqueName: \"kubernetes.io/projected/c70cf667-ebdd-414d-be40-62d26209abcf-kube-api-access-sd6kl\") pod \"c70cf667-ebdd-414d-be40-62d26209abcf\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " Mar 13 21:07:25 crc kubenswrapper[4790]: I0313 21:07:25.915438 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-inventory\") pod \"c70cf667-ebdd-414d-be40-62d26209abcf\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " Mar 13 21:07:25 crc kubenswrapper[4790]: I0313 21:07:25.915497 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-ssh-key-openstack-edpm-ipam\") pod \"c70cf667-ebdd-414d-be40-62d26209abcf\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " Mar 13 21:07:25 crc kubenswrapper[4790]: I0313 21:07:25.915558 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-secret-0\") pod \"c70cf667-ebdd-414d-be40-62d26209abcf\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " Mar 13 21:07:25 crc kubenswrapper[4790]: I0313 21:07:25.915584 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-combined-ca-bundle\") pod \"c70cf667-ebdd-414d-be40-62d26209abcf\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " Mar 13 21:07:25 crc kubenswrapper[4790]: I0313 21:07:25.921096 4790 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c70cf667-ebdd-414d-be40-62d26209abcf-kube-api-access-sd6kl" (OuterVolumeSpecName: "kube-api-access-sd6kl") pod "c70cf667-ebdd-414d-be40-62d26209abcf" (UID: "c70cf667-ebdd-414d-be40-62d26209abcf"). InnerVolumeSpecName "kube-api-access-sd6kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:07:25 crc kubenswrapper[4790]: I0313 21:07:25.921652 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "c70cf667-ebdd-414d-be40-62d26209abcf" (UID: "c70cf667-ebdd-414d-be40-62d26209abcf"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:07:25 crc kubenswrapper[4790]: I0313 21:07:25.943028 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c70cf667-ebdd-414d-be40-62d26209abcf" (UID: "c70cf667-ebdd-414d-be40-62d26209abcf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:07:25 crc kubenswrapper[4790]: I0313 21:07:25.943411 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "c70cf667-ebdd-414d-be40-62d26209abcf" (UID: "c70cf667-ebdd-414d-be40-62d26209abcf"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:07:25 crc kubenswrapper[4790]: I0313 21:07:25.946090 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-inventory" (OuterVolumeSpecName: "inventory") pod "c70cf667-ebdd-414d-be40-62d26209abcf" (UID: "c70cf667-ebdd-414d-be40-62d26209abcf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.018190 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.018234 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.018257 4790 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.018269 4790 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.018284 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd6kl\" (UniqueName: \"kubernetes.io/projected/c70cf667-ebdd-414d-be40-62d26209abcf-kube-api-access-sd6kl\") on node \"crc\" DevicePath \"\"" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.441122 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" event={"ID":"c70cf667-ebdd-414d-be40-62d26209abcf","Type":"ContainerDied","Data":"750819b8447adbdcf460745d4cc408a88dcc52443bc7524ebb6bbcda342e2ca3"} Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.441169 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="750819b8447adbdcf460745d4cc408a88dcc52443bc7524ebb6bbcda342e2ca3" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.441174 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.536011 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv"] Mar 13 21:07:26 crc kubenswrapper[4790]: E0313 21:07:26.538445 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" containerName="registry-server" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.538821 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" containerName="registry-server" Mar 13 21:07:26 crc kubenswrapper[4790]: E0313 21:07:26.538959 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" containerName="extract-utilities" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.539042 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" containerName="extract-utilities" Mar 13 21:07:26 crc kubenswrapper[4790]: E0313 21:07:26.539118 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c70cf667-ebdd-414d-be40-62d26209abcf" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.539196 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c70cf667-ebdd-414d-be40-62d26209abcf" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 13 21:07:26 crc kubenswrapper[4790]: E0313 21:07:26.539276 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" containerName="extract-content" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.539348 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" containerName="extract-content" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.539717 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c70cf667-ebdd-414d-be40-62d26209abcf" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.539807 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" containerName="registry-server" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.540699 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.544238 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.544588 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.544841 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.545285 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.545510 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.545811 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.545994 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.547488 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv"] Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.629433 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: 
I0313 21:07:26.629631 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.629656 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44jvg\" (UniqueName: \"kubernetes.io/projected/7b947c94-305d-453d-b2f0-bcf3c84467b3-kube-api-access-44jvg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.629787 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.629851 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.629959 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" 
(UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.629986 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.630072 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.630133 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.630186 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: 
\"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.630316 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.732035 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.732438 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.732458 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44jvg\" (UniqueName: \"kubernetes.io/projected/7b947c94-305d-453d-b2f0-bcf3c84467b3-kube-api-access-44jvg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.732502 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.732525 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.732558 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.732573 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.732599 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-2\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.732615 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.732644 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.732689 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.734033 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.736045 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.736812 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.737338 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.737505 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.737579 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.737789 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.738930 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.739006 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.739106 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.749738 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44jvg\" (UniqueName: 
\"kubernetes.io/projected/7b947c94-305d-453d-b2f0-bcf3c84467b3-kube-api-access-44jvg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.863341 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:27 crc kubenswrapper[4790]: I0313 21:07:27.361008 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv"] Mar 13 21:07:27 crc kubenswrapper[4790]: I0313 21:07:27.451912 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" event={"ID":"7b947c94-305d-453d-b2f0-bcf3c84467b3","Type":"ContainerStarted","Data":"5e955eaeb50a09aee8260c883bbe8ca2342e2062895d307915c8671ae2b82195"} Mar 13 21:07:27 crc kubenswrapper[4790]: I0313 21:07:27.659840 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:07:27 crc kubenswrapper[4790]: E0313 21:07:27.660264 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:07:28 crc kubenswrapper[4790]: I0313 21:07:28.460048 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" event={"ID":"7b947c94-305d-453d-b2f0-bcf3c84467b3","Type":"ContainerStarted","Data":"e137a0d045ac7d76f4b1c0eb61bbf159c0b0eb2b656ff90d4306bffd4b4bb7f4"} Mar 13 21:07:28 crc 
kubenswrapper[4790]: I0313 21:07:28.482240 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" podStartSLOduration=2.037676662 podStartE2EDuration="2.482214065s" podCreationTimestamp="2026-03-13 21:07:26 +0000 UTC" firstStartedPulling="2026-03-13 21:07:27.357646321 +0000 UTC m=+2378.378762212" lastFinishedPulling="2026-03-13 21:07:27.802183724 +0000 UTC m=+2378.823299615" observedRunningTime="2026-03-13 21:07:28.47691398 +0000 UTC m=+2379.498029871" watchObservedRunningTime="2026-03-13 21:07:28.482214065 +0000 UTC m=+2379.503329956" Mar 13 21:07:41 crc kubenswrapper[4790]: I0313 21:07:41.660433 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:07:41 crc kubenswrapper[4790]: E0313 21:07:41.662714 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:07:56 crc kubenswrapper[4790]: I0313 21:07:56.660145 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:07:56 crc kubenswrapper[4790]: E0313 21:07:56.661058 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:08:00 crc 
kubenswrapper[4790]: I0313 21:08:00.145232 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557268-fbrfb"] Mar 13 21:08:00 crc kubenswrapper[4790]: I0313 21:08:00.147096 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557268-fbrfb" Mar 13 21:08:00 crc kubenswrapper[4790]: I0313 21:08:00.162247 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557268-fbrfb"] Mar 13 21:08:00 crc kubenswrapper[4790]: I0313 21:08:00.165031 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:08:00 crc kubenswrapper[4790]: I0313 21:08:00.165864 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:08:00 crc kubenswrapper[4790]: I0313 21:08:00.166099 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:08:00 crc kubenswrapper[4790]: I0313 21:08:00.224886 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsrpx\" (UniqueName: \"kubernetes.io/projected/5f962ddd-b18b-43c7-81e1-7eda48d64d88-kube-api-access-qsrpx\") pod \"auto-csr-approver-29557268-fbrfb\" (UID: \"5f962ddd-b18b-43c7-81e1-7eda48d64d88\") " pod="openshift-infra/auto-csr-approver-29557268-fbrfb" Mar 13 21:08:00 crc kubenswrapper[4790]: I0313 21:08:00.327504 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsrpx\" (UniqueName: \"kubernetes.io/projected/5f962ddd-b18b-43c7-81e1-7eda48d64d88-kube-api-access-qsrpx\") pod \"auto-csr-approver-29557268-fbrfb\" (UID: \"5f962ddd-b18b-43c7-81e1-7eda48d64d88\") " pod="openshift-infra/auto-csr-approver-29557268-fbrfb" Mar 13 21:08:00 crc kubenswrapper[4790]: I0313 21:08:00.346604 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qsrpx\" (UniqueName: \"kubernetes.io/projected/5f962ddd-b18b-43c7-81e1-7eda48d64d88-kube-api-access-qsrpx\") pod \"auto-csr-approver-29557268-fbrfb\" (UID: \"5f962ddd-b18b-43c7-81e1-7eda48d64d88\") " pod="openshift-infra/auto-csr-approver-29557268-fbrfb" Mar 13 21:08:00 crc kubenswrapper[4790]: I0313 21:08:00.471520 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557268-fbrfb" Mar 13 21:08:00 crc kubenswrapper[4790]: I0313 21:08:00.966779 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557268-fbrfb"] Mar 13 21:08:00 crc kubenswrapper[4790]: W0313 21:08:00.975703 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f962ddd_b18b_43c7_81e1_7eda48d64d88.slice/crio-c60da4f485099ecee890af9941d9bfee6803705f73e9055e04d52a650268e3bc WatchSource:0}: Error finding container c60da4f485099ecee890af9941d9bfee6803705f73e9055e04d52a650268e3bc: Status 404 returned error can't find the container with id c60da4f485099ecee890af9941d9bfee6803705f73e9055e04d52a650268e3bc Mar 13 21:08:01 crc kubenswrapper[4790]: I0313 21:08:01.731298 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557268-fbrfb" event={"ID":"5f962ddd-b18b-43c7-81e1-7eda48d64d88","Type":"ContainerStarted","Data":"c60da4f485099ecee890af9941d9bfee6803705f73e9055e04d52a650268e3bc"} Mar 13 21:08:02 crc kubenswrapper[4790]: I0313 21:08:02.742608 4790 generic.go:334] "Generic (PLEG): container finished" podID="5f962ddd-b18b-43c7-81e1-7eda48d64d88" containerID="bdca2f8da697e12973555a54d7d0753abfb943fd0d2919dd4adb4178a3e9c052" exitCode=0 Mar 13 21:08:02 crc kubenswrapper[4790]: I0313 21:08:02.742673 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557268-fbrfb" 
event={"ID":"5f962ddd-b18b-43c7-81e1-7eda48d64d88","Type":"ContainerDied","Data":"bdca2f8da697e12973555a54d7d0753abfb943fd0d2919dd4adb4178a3e9c052"} Mar 13 21:08:04 crc kubenswrapper[4790]: I0313 21:08:04.046838 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557268-fbrfb" Mar 13 21:08:04 crc kubenswrapper[4790]: I0313 21:08:04.204235 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsrpx\" (UniqueName: \"kubernetes.io/projected/5f962ddd-b18b-43c7-81e1-7eda48d64d88-kube-api-access-qsrpx\") pod \"5f962ddd-b18b-43c7-81e1-7eda48d64d88\" (UID: \"5f962ddd-b18b-43c7-81e1-7eda48d64d88\") " Mar 13 21:08:04 crc kubenswrapper[4790]: I0313 21:08:04.211047 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f962ddd-b18b-43c7-81e1-7eda48d64d88-kube-api-access-qsrpx" (OuterVolumeSpecName: "kube-api-access-qsrpx") pod "5f962ddd-b18b-43c7-81e1-7eda48d64d88" (UID: "5f962ddd-b18b-43c7-81e1-7eda48d64d88"). InnerVolumeSpecName "kube-api-access-qsrpx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:08:04 crc kubenswrapper[4790]: I0313 21:08:04.306941 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsrpx\" (UniqueName: \"kubernetes.io/projected/5f962ddd-b18b-43c7-81e1-7eda48d64d88-kube-api-access-qsrpx\") on node \"crc\" DevicePath \"\"" Mar 13 21:08:04 crc kubenswrapper[4790]: I0313 21:08:04.760233 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557268-fbrfb" event={"ID":"5f962ddd-b18b-43c7-81e1-7eda48d64d88","Type":"ContainerDied","Data":"c60da4f485099ecee890af9941d9bfee6803705f73e9055e04d52a650268e3bc"} Mar 13 21:08:04 crc kubenswrapper[4790]: I0313 21:08:04.760743 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c60da4f485099ecee890af9941d9bfee6803705f73e9055e04d52a650268e3bc" Mar 13 21:08:04 crc kubenswrapper[4790]: I0313 21:08:04.760294 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557268-fbrfb" Mar 13 21:08:05 crc kubenswrapper[4790]: I0313 21:08:05.117282 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557262-wdkmw"] Mar 13 21:08:05 crc kubenswrapper[4790]: I0313 21:08:05.128360 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557262-wdkmw"] Mar 13 21:08:05 crc kubenswrapper[4790]: I0313 21:08:05.670889 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b5219d2-3afd-4a8d-ab26-3102b6dee3b0" path="/var/lib/kubelet/pods/5b5219d2-3afd-4a8d-ab26-3102b6dee3b0/volumes" Mar 13 21:08:11 crc kubenswrapper[4790]: I0313 21:08:11.660467 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:08:11 crc kubenswrapper[4790]: E0313 21:08:11.661371 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:08:25 crc kubenswrapper[4790]: I0313 21:08:25.660280 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:08:25 crc kubenswrapper[4790]: E0313 21:08:25.661168 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:08:38 crc kubenswrapper[4790]: I0313 21:08:38.659575 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:08:38 crc kubenswrapper[4790]: E0313 21:08:38.660419 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:08:50 crc kubenswrapper[4790]: I0313 21:08:50.660504 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:08:50 crc kubenswrapper[4790]: E0313 21:08:50.661504 4790 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:08:57 crc kubenswrapper[4790]: I0313 21:08:57.167618 4790 scope.go:117] "RemoveContainer" containerID="c9fc9237e156eb0becb6b2dc2279bf5dc16eec046e67e33454f05890e75163e2" Mar 13 21:09:01 crc kubenswrapper[4790]: I0313 21:09:01.660114 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:09:01 crc kubenswrapper[4790]: E0313 21:09:01.660942 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:09:15 crc kubenswrapper[4790]: I0313 21:09:15.660757 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:09:15 crc kubenswrapper[4790]: E0313 21:09:15.661564 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:09:28 crc kubenswrapper[4790]: I0313 21:09:28.659693 4790 scope.go:117] 
"RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:09:28 crc kubenswrapper[4790]: E0313 21:09:28.661558 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.224511 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-692s5"] Mar 13 21:09:33 crc kubenswrapper[4790]: E0313 21:09:33.228853 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f962ddd-b18b-43c7-81e1-7eda48d64d88" containerName="oc" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.228910 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f962ddd-b18b-43c7-81e1-7eda48d64d88" containerName="oc" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.229884 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f962ddd-b18b-43c7-81e1-7eda48d64d88" containerName="oc" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.242067 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.253892 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-692s5"] Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.337874 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qzdl\" (UniqueName: \"kubernetes.io/projected/e70f4ff5-2cd5-4915-978d-dfb989d52730-kube-api-access-9qzdl\") pod \"community-operators-692s5\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.337945 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-utilities\") pod \"community-operators-692s5\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.338107 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-catalog-content\") pod \"community-operators-692s5\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.439679 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-catalog-content\") pod \"community-operators-692s5\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.439830 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9qzdl\" (UniqueName: \"kubernetes.io/projected/e70f4ff5-2cd5-4915-978d-dfb989d52730-kube-api-access-9qzdl\") pod \"community-operators-692s5\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.439863 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-utilities\") pod \"community-operators-692s5\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.440201 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-catalog-content\") pod \"community-operators-692s5\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.440365 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-utilities\") pod \"community-operators-692s5\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.467195 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qzdl\" (UniqueName: \"kubernetes.io/projected/e70f4ff5-2cd5-4915-978d-dfb989d52730-kube-api-access-9qzdl\") pod \"community-operators-692s5\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.578736 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:34 crc kubenswrapper[4790]: I0313 21:09:34.181973 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-692s5"] Mar 13 21:09:34 crc kubenswrapper[4790]: I0313 21:09:34.539099 4790 generic.go:334] "Generic (PLEG): container finished" podID="e70f4ff5-2cd5-4915-978d-dfb989d52730" containerID="54054605b065d4065b14fefe9ba5cccd75ef0813fb8b5985609283339aa3b579" exitCode=0 Mar 13 21:09:34 crc kubenswrapper[4790]: I0313 21:09:34.540604 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-692s5" event={"ID":"e70f4ff5-2cd5-4915-978d-dfb989d52730","Type":"ContainerDied","Data":"54054605b065d4065b14fefe9ba5cccd75ef0813fb8b5985609283339aa3b579"} Mar 13 21:09:34 crc kubenswrapper[4790]: I0313 21:09:34.540900 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-692s5" event={"ID":"e70f4ff5-2cd5-4915-978d-dfb989d52730","Type":"ContainerStarted","Data":"f098f8de58f4430eacc872f8239f13bde3881a4b8d296b404f354b27ab3de96c"} Mar 13 21:09:36 crc kubenswrapper[4790]: I0313 21:09:36.560907 4790 generic.go:334] "Generic (PLEG): container finished" podID="e70f4ff5-2cd5-4915-978d-dfb989d52730" containerID="dbf0f2ebae2b53a631b13b83ca757955eedb6cbb35341b4b4454b85dcdac51ec" exitCode=0 Mar 13 21:09:36 crc kubenswrapper[4790]: I0313 21:09:36.560961 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-692s5" event={"ID":"e70f4ff5-2cd5-4915-978d-dfb989d52730","Type":"ContainerDied","Data":"dbf0f2ebae2b53a631b13b83ca757955eedb6cbb35341b4b4454b85dcdac51ec"} Mar 13 21:09:37 crc kubenswrapper[4790]: I0313 21:09:37.571596 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-692s5" 
event={"ID":"e70f4ff5-2cd5-4915-978d-dfb989d52730","Type":"ContainerStarted","Data":"3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f"} Mar 13 21:09:37 crc kubenswrapper[4790]: I0313 21:09:37.599014 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-692s5" podStartSLOduration=2.023030629 podStartE2EDuration="4.598990456s" podCreationTimestamp="2026-03-13 21:09:33 +0000 UTC" firstStartedPulling="2026-03-13 21:09:34.544180948 +0000 UTC m=+2505.565296839" lastFinishedPulling="2026-03-13 21:09:37.120140785 +0000 UTC m=+2508.141256666" observedRunningTime="2026-03-13 21:09:37.586181994 +0000 UTC m=+2508.607297895" watchObservedRunningTime="2026-03-13 21:09:37.598990456 +0000 UTC m=+2508.620106347" Mar 13 21:09:39 crc kubenswrapper[4790]: I0313 21:09:39.590695 4790 generic.go:334] "Generic (PLEG): container finished" podID="7b947c94-305d-453d-b2f0-bcf3c84467b3" containerID="e137a0d045ac7d76f4b1c0eb61bbf159c0b0eb2b656ff90d4306bffd4b4bb7f4" exitCode=0 Mar 13 21:09:39 crc kubenswrapper[4790]: I0313 21:09:39.591472 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" event={"ID":"7b947c94-305d-453d-b2f0-bcf3c84467b3","Type":"ContainerDied","Data":"e137a0d045ac7d76f4b1c0eb61bbf159c0b0eb2b656ff90d4306bffd4b4bb7f4"} Mar 13 21:09:39 crc kubenswrapper[4790]: I0313 21:09:39.670503 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:09:39 crc kubenswrapper[4790]: E0313 21:09:39.670867 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" 
podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:09:40 crc kubenswrapper[4790]: I0313 21:09:40.995924 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.085815 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44jvg\" (UniqueName: \"kubernetes.io/projected/7b947c94-305d-453d-b2f0-bcf3c84467b3-kube-api-access-44jvg\") pod \"7b947c94-305d-453d-b2f0-bcf3c84467b3\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.085897 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-inventory\") pod \"7b947c94-305d-453d-b2f0-bcf3c84467b3\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.085934 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-1\") pod \"7b947c94-305d-453d-b2f0-bcf3c84467b3\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.086010 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-ssh-key-openstack-edpm-ipam\") pod \"7b947c94-305d-453d-b2f0-bcf3c84467b3\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.086062 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-0\") pod 
\"7b947c94-305d-453d-b2f0-bcf3c84467b3\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.086151 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-extra-config-0\") pod \"7b947c94-305d-453d-b2f0-bcf3c84467b3\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.086180 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-2\") pod \"7b947c94-305d-453d-b2f0-bcf3c84467b3\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.086209 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-1\") pod \"7b947c94-305d-453d-b2f0-bcf3c84467b3\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.086261 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-combined-ca-bundle\") pod \"7b947c94-305d-453d-b2f0-bcf3c84467b3\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.086299 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-0\") pod \"7b947c94-305d-453d-b2f0-bcf3c84467b3\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 
21:09:41.086340 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-3\") pod \"7b947c94-305d-453d-b2f0-bcf3c84467b3\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.092041 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b947c94-305d-453d-b2f0-bcf3c84467b3-kube-api-access-44jvg" (OuterVolumeSpecName: "kube-api-access-44jvg") pod "7b947c94-305d-453d-b2f0-bcf3c84467b3" (UID: "7b947c94-305d-453d-b2f0-bcf3c84467b3"). InnerVolumeSpecName "kube-api-access-44jvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.108752 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7b947c94-305d-453d-b2f0-bcf3c84467b3" (UID: "7b947c94-305d-453d-b2f0-bcf3c84467b3"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.118597 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "7b947c94-305d-453d-b2f0-bcf3c84467b3" (UID: "7b947c94-305d-453d-b2f0-bcf3c84467b3"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.121806 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "7b947c94-305d-453d-b2f0-bcf3c84467b3" (UID: "7b947c94-305d-453d-b2f0-bcf3c84467b3"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.132077 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7b947c94-305d-453d-b2f0-bcf3c84467b3" (UID: "7b947c94-305d-453d-b2f0-bcf3c84467b3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.133544 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "7b947c94-305d-453d-b2f0-bcf3c84467b3" (UID: "7b947c94-305d-453d-b2f0-bcf3c84467b3"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.137841 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "7b947c94-305d-453d-b2f0-bcf3c84467b3" (UID: "7b947c94-305d-453d-b2f0-bcf3c84467b3"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.140183 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "7b947c94-305d-453d-b2f0-bcf3c84467b3" (UID: "7b947c94-305d-453d-b2f0-bcf3c84467b3"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.141841 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "7b947c94-305d-453d-b2f0-bcf3c84467b3" (UID: "7b947c94-305d-453d-b2f0-bcf3c84467b3"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.143124 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "7b947c94-305d-453d-b2f0-bcf3c84467b3" (UID: "7b947c94-305d-453d-b2f0-bcf3c84467b3"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.161437 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-inventory" (OuterVolumeSpecName: "inventory") pod "7b947c94-305d-453d-b2f0-bcf3c84467b3" (UID: "7b947c94-305d-453d-b2f0-bcf3c84467b3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.189212 4790 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.189252 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44jvg\" (UniqueName: \"kubernetes.io/projected/7b947c94-305d-453d-b2f0-bcf3c84467b3-kube-api-access-44jvg\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.189261 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.189271 4790 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.189281 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.189290 4790 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.189301 4790 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-extra-config-0\") on 
node \"crc\" DevicePath \"\"" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.189312 4790 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.189322 4790 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.189333 4790 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.189345 4790 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.611179 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" event={"ID":"7b947c94-305d-453d-b2f0-bcf3c84467b3","Type":"ContainerDied","Data":"5e955eaeb50a09aee8260c883bbe8ca2342e2062895d307915c8671ae2b82195"} Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.611573 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e955eaeb50a09aee8260c883bbe8ca2342e2062895d307915c8671ae2b82195" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.611245 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.710525 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr"] Mar 13 21:09:41 crc kubenswrapper[4790]: E0313 21:09:41.711161 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b947c94-305d-453d-b2f0-bcf3c84467b3" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.711189 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b947c94-305d-453d-b2f0-bcf3c84467b3" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.711452 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b947c94-305d-453d-b2f0-bcf3c84467b3" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.712144 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.716113 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.716269 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.716278 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.716285 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.716368 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.723388 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr"] Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.801044 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.801433 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.801518 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.801790 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.801959 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.802016 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.802132 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd6pw\" (UniqueName: \"kubernetes.io/projected/71b17a66-faf5-4379-ace9-a4fff12cac5b-kube-api-access-wd6pw\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.904625 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.904676 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.904703 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.904771 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.904828 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.904853 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.904911 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd6pw\" (UniqueName: \"kubernetes.io/projected/71b17a66-faf5-4379-ace9-a4fff12cac5b-kube-api-access-wd6pw\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.908545 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.908566 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.908622 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.908640 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.909517 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.910182 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.923154 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd6pw\" (UniqueName: \"kubernetes.io/projected/71b17a66-faf5-4379-ace9-a4fff12cac5b-kube-api-access-wd6pw\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:42 crc kubenswrapper[4790]: I0313 21:09:42.106736 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:42 crc kubenswrapper[4790]: I0313 21:09:42.616492 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr"] Mar 13 21:09:43 crc kubenswrapper[4790]: I0313 21:09:43.579352 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:43 crc kubenswrapper[4790]: I0313 21:09:43.579661 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:43 crc kubenswrapper[4790]: I0313 21:09:43.629667 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" event={"ID":"71b17a66-faf5-4379-ace9-a4fff12cac5b","Type":"ContainerStarted","Data":"fbd28e9d93d15c499d0b1595969d50c3452959738f825ad9e7b6e4609348ae9c"} Mar 13 21:09:43 crc kubenswrapper[4790]: I0313 21:09:43.629736 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" event={"ID":"71b17a66-faf5-4379-ace9-a4fff12cac5b","Type":"ContainerStarted","Data":"1ebb38435b2d88705a4cd2dfc97f54f6a8a51f0502550183f97cc9f577d6ca95"} Mar 13 21:09:43 crc kubenswrapper[4790]: I0313 21:09:43.635429 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:43 crc kubenswrapper[4790]: I0313 21:09:43.656669 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" podStartSLOduration=2.24259048 podStartE2EDuration="2.656640431s" podCreationTimestamp="2026-03-13 21:09:41 +0000 UTC" firstStartedPulling="2026-03-13 21:09:42.618344224 +0000 UTC m=+2513.639460115" lastFinishedPulling="2026-03-13 21:09:43.032394175 +0000 UTC 
m=+2514.053510066" observedRunningTime="2026-03-13 21:09:43.647096072 +0000 UTC m=+2514.668212003" watchObservedRunningTime="2026-03-13 21:09:43.656640431 +0000 UTC m=+2514.677756342" Mar 13 21:09:43 crc kubenswrapper[4790]: I0313 21:09:43.693680 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:43 crc kubenswrapper[4790]: I0313 21:09:43.896863 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-692s5"] Mar 13 21:09:45 crc kubenswrapper[4790]: I0313 21:09:45.642752 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-692s5" podUID="e70f4ff5-2cd5-4915-978d-dfb989d52730" containerName="registry-server" containerID="cri-o://3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f" gracePeriod=2 Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.113288 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.291919 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-utilities\") pod \"e70f4ff5-2cd5-4915-978d-dfb989d52730\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.291997 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-catalog-content\") pod \"e70f4ff5-2cd5-4915-978d-dfb989d52730\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.292133 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qzdl\" (UniqueName: \"kubernetes.io/projected/e70f4ff5-2cd5-4915-978d-dfb989d52730-kube-api-access-9qzdl\") pod \"e70f4ff5-2cd5-4915-978d-dfb989d52730\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.292913 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-utilities" (OuterVolumeSpecName: "utilities") pod "e70f4ff5-2cd5-4915-978d-dfb989d52730" (UID: "e70f4ff5-2cd5-4915-978d-dfb989d52730"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.293733 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.303731 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e70f4ff5-2cd5-4915-978d-dfb989d52730-kube-api-access-9qzdl" (OuterVolumeSpecName: "kube-api-access-9qzdl") pod "e70f4ff5-2cd5-4915-978d-dfb989d52730" (UID: "e70f4ff5-2cd5-4915-978d-dfb989d52730"). InnerVolumeSpecName "kube-api-access-9qzdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.395954 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qzdl\" (UniqueName: \"kubernetes.io/projected/e70f4ff5-2cd5-4915-978d-dfb989d52730-kube-api-access-9qzdl\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.652504 4790 generic.go:334] "Generic (PLEG): container finished" podID="e70f4ff5-2cd5-4915-978d-dfb989d52730" containerID="3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f" exitCode=0 Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.652556 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-692s5" event={"ID":"e70f4ff5-2cd5-4915-978d-dfb989d52730","Type":"ContainerDied","Data":"3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f"} Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.652588 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-692s5" event={"ID":"e70f4ff5-2cd5-4915-978d-dfb989d52730","Type":"ContainerDied","Data":"f098f8de58f4430eacc872f8239f13bde3881a4b8d296b404f354b27ab3de96c"} Mar 13 21:09:46 crc kubenswrapper[4790]: 
I0313 21:09:46.652584 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.652606 4790 scope.go:117] "RemoveContainer" containerID="3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.676041 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e70f4ff5-2cd5-4915-978d-dfb989d52730" (UID: "e70f4ff5-2cd5-4915-978d-dfb989d52730"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.677823 4790 scope.go:117] "RemoveContainer" containerID="dbf0f2ebae2b53a631b13b83ca757955eedb6cbb35341b4b4454b85dcdac51ec" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.699176 4790 scope.go:117] "RemoveContainer" containerID="54054605b065d4065b14fefe9ba5cccd75ef0813fb8b5985609283339aa3b579" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.700802 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.742600 4790 scope.go:117] "RemoveContainer" containerID="3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f" Mar 13 21:09:46 crc kubenswrapper[4790]: E0313 21:09:46.743013 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f\": container with ID starting with 3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f not found: ID does not exist" 
containerID="3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.743042 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f"} err="failed to get container status \"3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f\": rpc error: code = NotFound desc = could not find container \"3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f\": container with ID starting with 3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f not found: ID does not exist" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.743064 4790 scope.go:117] "RemoveContainer" containerID="dbf0f2ebae2b53a631b13b83ca757955eedb6cbb35341b4b4454b85dcdac51ec" Mar 13 21:09:46 crc kubenswrapper[4790]: E0313 21:09:46.743506 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbf0f2ebae2b53a631b13b83ca757955eedb6cbb35341b4b4454b85dcdac51ec\": container with ID starting with dbf0f2ebae2b53a631b13b83ca757955eedb6cbb35341b4b4454b85dcdac51ec not found: ID does not exist" containerID="dbf0f2ebae2b53a631b13b83ca757955eedb6cbb35341b4b4454b85dcdac51ec" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.743553 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbf0f2ebae2b53a631b13b83ca757955eedb6cbb35341b4b4454b85dcdac51ec"} err="failed to get container status \"dbf0f2ebae2b53a631b13b83ca757955eedb6cbb35341b4b4454b85dcdac51ec\": rpc error: code = NotFound desc = could not find container \"dbf0f2ebae2b53a631b13b83ca757955eedb6cbb35341b4b4454b85dcdac51ec\": container with ID starting with dbf0f2ebae2b53a631b13b83ca757955eedb6cbb35341b4b4454b85dcdac51ec not found: ID does not exist" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.743589 4790 scope.go:117] 
"RemoveContainer" containerID="54054605b065d4065b14fefe9ba5cccd75ef0813fb8b5985609283339aa3b579" Mar 13 21:09:46 crc kubenswrapper[4790]: E0313 21:09:46.743980 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54054605b065d4065b14fefe9ba5cccd75ef0813fb8b5985609283339aa3b579\": container with ID starting with 54054605b065d4065b14fefe9ba5cccd75ef0813fb8b5985609283339aa3b579 not found: ID does not exist" containerID="54054605b065d4065b14fefe9ba5cccd75ef0813fb8b5985609283339aa3b579" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.744009 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54054605b065d4065b14fefe9ba5cccd75ef0813fb8b5985609283339aa3b579"} err="failed to get container status \"54054605b065d4065b14fefe9ba5cccd75ef0813fb8b5985609283339aa3b579\": rpc error: code = NotFound desc = could not find container \"54054605b065d4065b14fefe9ba5cccd75ef0813fb8b5985609283339aa3b579\": container with ID starting with 54054605b065d4065b14fefe9ba5cccd75ef0813fb8b5985609283339aa3b579 not found: ID does not exist" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.988986 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-692s5"] Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.997822 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-692s5"] Mar 13 21:09:47 crc kubenswrapper[4790]: I0313 21:09:47.675956 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e70f4ff5-2cd5-4915-978d-dfb989d52730" path="/var/lib/kubelet/pods/e70f4ff5-2cd5-4915-978d-dfb989d52730/volumes" Mar 13 21:09:53 crc kubenswrapper[4790]: I0313 21:09:53.661302 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:09:53 crc kubenswrapper[4790]: E0313 21:09:53.663175 4790 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.145283 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557270-xndpr"] Mar 13 21:10:00 crc kubenswrapper[4790]: E0313 21:10:00.146497 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70f4ff5-2cd5-4915-978d-dfb989d52730" containerName="registry-server" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.146516 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70f4ff5-2cd5-4915-978d-dfb989d52730" containerName="registry-server" Mar 13 21:10:00 crc kubenswrapper[4790]: E0313 21:10:00.146543 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70f4ff5-2cd5-4915-978d-dfb989d52730" containerName="extract-utilities" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.146552 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70f4ff5-2cd5-4915-978d-dfb989d52730" containerName="extract-utilities" Mar 13 21:10:00 crc kubenswrapper[4790]: E0313 21:10:00.146572 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70f4ff5-2cd5-4915-978d-dfb989d52730" containerName="extract-content" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.146582 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70f4ff5-2cd5-4915-978d-dfb989d52730" containerName="extract-content" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.146837 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e70f4ff5-2cd5-4915-978d-dfb989d52730" containerName="registry-server" Mar 
13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.147833 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557270-xndpr" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.150121 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7ftr\" (UniqueName: \"kubernetes.io/projected/68f751f6-8e31-448a-99e9-bf7f290684be-kube-api-access-r7ftr\") pod \"auto-csr-approver-29557270-xndpr\" (UID: \"68f751f6-8e31-448a-99e9-bf7f290684be\") " pod="openshift-infra/auto-csr-approver-29557270-xndpr" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.153183 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.153370 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.153548 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557270-xndpr"] Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.158597 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.251725 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7ftr\" (UniqueName: \"kubernetes.io/projected/68f751f6-8e31-448a-99e9-bf7f290684be-kube-api-access-r7ftr\") pod \"auto-csr-approver-29557270-xndpr\" (UID: \"68f751f6-8e31-448a-99e9-bf7f290684be\") " pod="openshift-infra/auto-csr-approver-29557270-xndpr" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.271334 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7ftr\" (UniqueName: 
\"kubernetes.io/projected/68f751f6-8e31-448a-99e9-bf7f290684be-kube-api-access-r7ftr\") pod \"auto-csr-approver-29557270-xndpr\" (UID: \"68f751f6-8e31-448a-99e9-bf7f290684be\") " pod="openshift-infra/auto-csr-approver-29557270-xndpr" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.470096 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557270-xndpr" Mar 13 21:10:01 crc kubenswrapper[4790]: I0313 21:10:00.890508 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557270-xndpr"] Mar 13 21:10:01 crc kubenswrapper[4790]: W0313 21:10:00.893833 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68f751f6_8e31_448a_99e9_bf7f290684be.slice/crio-d15c15610e5327ea8bfd867c70011f229b7e35ab77317203ec82da4de9241a01 WatchSource:0}: Error finding container d15c15610e5327ea8bfd867c70011f229b7e35ab77317203ec82da4de9241a01: Status 404 returned error can't find the container with id d15c15610e5327ea8bfd867c70011f229b7e35ab77317203ec82da4de9241a01 Mar 13 21:10:01 crc kubenswrapper[4790]: I0313 21:10:01.797422 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557270-xndpr" event={"ID":"68f751f6-8e31-448a-99e9-bf7f290684be","Type":"ContainerStarted","Data":"d15c15610e5327ea8bfd867c70011f229b7e35ab77317203ec82da4de9241a01"} Mar 13 21:10:02 crc kubenswrapper[4790]: I0313 21:10:02.806812 4790 generic.go:334] "Generic (PLEG): container finished" podID="68f751f6-8e31-448a-99e9-bf7f290684be" containerID="231c0b730759ce0ec6fa00fad0e521d17888055794c0179d0b2c116cf68aaf15" exitCode=0 Mar 13 21:10:02 crc kubenswrapper[4790]: I0313 21:10:02.806899 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557270-xndpr" 
event={"ID":"68f751f6-8e31-448a-99e9-bf7f290684be","Type":"ContainerDied","Data":"231c0b730759ce0ec6fa00fad0e521d17888055794c0179d0b2c116cf68aaf15"} Mar 13 21:10:04 crc kubenswrapper[4790]: I0313 21:10:04.128605 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557270-xndpr" Mar 13 21:10:04 crc kubenswrapper[4790]: I0313 21:10:04.326041 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7ftr\" (UniqueName: \"kubernetes.io/projected/68f751f6-8e31-448a-99e9-bf7f290684be-kube-api-access-r7ftr\") pod \"68f751f6-8e31-448a-99e9-bf7f290684be\" (UID: \"68f751f6-8e31-448a-99e9-bf7f290684be\") " Mar 13 21:10:04 crc kubenswrapper[4790]: I0313 21:10:04.333045 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68f751f6-8e31-448a-99e9-bf7f290684be-kube-api-access-r7ftr" (OuterVolumeSpecName: "kube-api-access-r7ftr") pod "68f751f6-8e31-448a-99e9-bf7f290684be" (UID: "68f751f6-8e31-448a-99e9-bf7f290684be"). InnerVolumeSpecName "kube-api-access-r7ftr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:10:04 crc kubenswrapper[4790]: I0313 21:10:04.428168 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7ftr\" (UniqueName: \"kubernetes.io/projected/68f751f6-8e31-448a-99e9-bf7f290684be-kube-api-access-r7ftr\") on node \"crc\" DevicePath \"\"" Mar 13 21:10:04 crc kubenswrapper[4790]: I0313 21:10:04.825405 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557270-xndpr" event={"ID":"68f751f6-8e31-448a-99e9-bf7f290684be","Type":"ContainerDied","Data":"d15c15610e5327ea8bfd867c70011f229b7e35ab77317203ec82da4de9241a01"} Mar 13 21:10:04 crc kubenswrapper[4790]: I0313 21:10:04.825454 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d15c15610e5327ea8bfd867c70011f229b7e35ab77317203ec82da4de9241a01" Mar 13 21:10:04 crc kubenswrapper[4790]: I0313 21:10:04.825519 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557270-xndpr" Mar 13 21:10:05 crc kubenswrapper[4790]: I0313 21:10:05.185405 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557264-b5j85"] Mar 13 21:10:05 crc kubenswrapper[4790]: I0313 21:10:05.193836 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557264-b5j85"] Mar 13 21:10:05 crc kubenswrapper[4790]: I0313 21:10:05.679081 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33811d20-0fb8-4b06-a9dd-d2488b19d7b9" path="/var/lib/kubelet/pods/33811d20-0fb8-4b06-a9dd-d2488b19d7b9/volumes" Mar 13 21:10:06 crc kubenswrapper[4790]: I0313 21:10:06.660047 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:10:06 crc kubenswrapper[4790]: E0313 21:10:06.660877 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:10:20 crc kubenswrapper[4790]: I0313 21:10:20.660665 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:10:20 crc kubenswrapper[4790]: E0313 21:10:20.661616 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:10:33 crc kubenswrapper[4790]: I0313 21:10:33.660777 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:10:33 crc kubenswrapper[4790]: E0313 21:10:33.661618 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:10:45 crc kubenswrapper[4790]: I0313 21:10:45.661340 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:10:45 crc kubenswrapper[4790]: E0313 21:10:45.662075 4790 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:10:57 crc kubenswrapper[4790]: I0313 21:10:57.294363 4790 scope.go:117] "RemoveContainer" containerID="3a443bd9f4b8d1df7af93baf309b6b85a45139407ed6e8e7a9df32fd174d2a54" Mar 13 21:10:58 crc kubenswrapper[4790]: I0313 21:10:58.660424 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:10:58 crc kubenswrapper[4790]: E0313 21:10:58.661209 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:11:10 crc kubenswrapper[4790]: I0313 21:11:10.660140 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:11:10 crc kubenswrapper[4790]: E0313 21:11:10.661824 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:11:25 crc kubenswrapper[4790]: I0313 21:11:25.660304 4790 scope.go:117] 
"RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:11:26 crc kubenswrapper[4790]: I0313 21:11:26.561328 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"75f331721e6162201038d479ba2bbbbd3f6476b2bf5be1d38a4c2de09e217795"} Mar 13 21:11:48 crc kubenswrapper[4790]: I0313 21:11:48.760988 4790 generic.go:334] "Generic (PLEG): container finished" podID="71b17a66-faf5-4379-ace9-a4fff12cac5b" containerID="fbd28e9d93d15c499d0b1595969d50c3452959738f825ad9e7b6e4609348ae9c" exitCode=0 Mar 13 21:11:48 crc kubenswrapper[4790]: I0313 21:11:48.761097 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" event={"ID":"71b17a66-faf5-4379-ace9-a4fff12cac5b","Type":"ContainerDied","Data":"fbd28e9d93d15c499d0b1595969d50c3452959738f825ad9e7b6e4609348ae9c"} Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.168387 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.276597 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-2\") pod \"71b17a66-faf5-4379-ace9-a4fff12cac5b\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.276661 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-1\") pod \"71b17a66-faf5-4379-ace9-a4fff12cac5b\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.276727 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-0\") pod \"71b17a66-faf5-4379-ace9-a4fff12cac5b\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.276752 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd6pw\" (UniqueName: \"kubernetes.io/projected/71b17a66-faf5-4379-ace9-a4fff12cac5b-kube-api-access-wd6pw\") pod \"71b17a66-faf5-4379-ace9-a4fff12cac5b\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.276775 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ssh-key-openstack-edpm-ipam\") pod \"71b17a66-faf5-4379-ace9-a4fff12cac5b\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " 
Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.276845 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-telemetry-combined-ca-bundle\") pod \"71b17a66-faf5-4379-ace9-a4fff12cac5b\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.276909 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-inventory\") pod \"71b17a66-faf5-4379-ace9-a4fff12cac5b\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.293811 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71b17a66-faf5-4379-ace9-a4fff12cac5b-kube-api-access-wd6pw" (OuterVolumeSpecName: "kube-api-access-wd6pw") pod "71b17a66-faf5-4379-ace9-a4fff12cac5b" (UID: "71b17a66-faf5-4379-ace9-a4fff12cac5b"). InnerVolumeSpecName "kube-api-access-wd6pw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.293867 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "71b17a66-faf5-4379-ace9-a4fff12cac5b" (UID: "71b17a66-faf5-4379-ace9-a4fff12cac5b"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.306021 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "71b17a66-faf5-4379-ace9-a4fff12cac5b" (UID: "71b17a66-faf5-4379-ace9-a4fff12cac5b"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.307360 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "71b17a66-faf5-4379-ace9-a4fff12cac5b" (UID: "71b17a66-faf5-4379-ace9-a4fff12cac5b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.312467 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-inventory" (OuterVolumeSpecName: "inventory") pod "71b17a66-faf5-4379-ace9-a4fff12cac5b" (UID: "71b17a66-faf5-4379-ace9-a4fff12cac5b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.316305 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "71b17a66-faf5-4379-ace9-a4fff12cac5b" (UID: "71b17a66-faf5-4379-ace9-a4fff12cac5b"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.316673 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "71b17a66-faf5-4379-ace9-a4fff12cac5b" (UID: "71b17a66-faf5-4379-ace9-a4fff12cac5b"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.405025 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.405074 4790 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.405130 4790 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.405146 4790 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.405159 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd6pw\" (UniqueName: \"kubernetes.io/projected/71b17a66-faf5-4379-ace9-a4fff12cac5b-kube-api-access-wd6pw\") on node \"crc\" DevicePath \"\"" Mar 13 21:11:50 
crc kubenswrapper[4790]: I0313 21:11:50.405178 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.405192 4790 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.780079 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" event={"ID":"71b17a66-faf5-4379-ace9-a4fff12cac5b","Type":"ContainerDied","Data":"1ebb38435b2d88705a4cd2dfc97f54f6a8a51f0502550183f97cc9f577d6ca95"} Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.780123 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ebb38435b2d88705a4cd2dfc97f54f6a8a51f0502550183f97cc9f577d6ca95" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.780190 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:11:50 crc kubenswrapper[4790]: E0313 21:11:50.934513 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71b17a66_faf5_4379_ace9_a4fff12cac5b.slice/crio-1ebb38435b2d88705a4cd2dfc97f54f6a8a51f0502550183f97cc9f577d6ca95\": RecentStats: unable to find data in memory cache]" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.148307 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557272-cdqgf"] Mar 13 21:12:00 crc kubenswrapper[4790]: E0313 21:12:00.149186 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b17a66-faf5-4379-ace9-a4fff12cac5b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.149202 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b17a66-faf5-4379-ace9-a4fff12cac5b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 13 21:12:00 crc kubenswrapper[4790]: E0313 21:12:00.149243 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f751f6-8e31-448a-99e9-bf7f290684be" containerName="oc" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.149249 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f751f6-8e31-448a-99e9-bf7f290684be" containerName="oc" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.149434 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f751f6-8e31-448a-99e9-bf7f290684be" containerName="oc" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.149449 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="71b17a66-faf5-4379-ace9-a4fff12cac5b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.150048 4790 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557272-cdqgf" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.152321 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.152729 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.152791 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.156802 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557272-cdqgf"] Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.279450 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdcpq\" (UniqueName: \"kubernetes.io/projected/7b51130e-f39a-4a4c-b41e-c865e51004dd-kube-api-access-qdcpq\") pod \"auto-csr-approver-29557272-cdqgf\" (UID: \"7b51130e-f39a-4a4c-b41e-c865e51004dd\") " pod="openshift-infra/auto-csr-approver-29557272-cdqgf" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.380846 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdcpq\" (UniqueName: \"kubernetes.io/projected/7b51130e-f39a-4a4c-b41e-c865e51004dd-kube-api-access-qdcpq\") pod \"auto-csr-approver-29557272-cdqgf\" (UID: \"7b51130e-f39a-4a4c-b41e-c865e51004dd\") " pod="openshift-infra/auto-csr-approver-29557272-cdqgf" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.403019 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdcpq\" (UniqueName: \"kubernetes.io/projected/7b51130e-f39a-4a4c-b41e-c865e51004dd-kube-api-access-qdcpq\") pod \"auto-csr-approver-29557272-cdqgf\" (UID: 
\"7b51130e-f39a-4a4c-b41e-c865e51004dd\") " pod="openshift-infra/auto-csr-approver-29557272-cdqgf" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.467305 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557272-cdqgf" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.897127 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.902665 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557272-cdqgf"] Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.960523 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557272-cdqgf" event={"ID":"7b51130e-f39a-4a4c-b41e-c865e51004dd","Type":"ContainerStarted","Data":"1fa852d5331967e3d23cf1b3419e4624321eec271574a6ac797f9a29e8389d08"} Mar 13 21:12:02 crc kubenswrapper[4790]: I0313 21:12:02.978982 4790 generic.go:334] "Generic (PLEG): container finished" podID="7b51130e-f39a-4a4c-b41e-c865e51004dd" containerID="4ea25a336829635b84ca0d8e478c73129cc595166d50214c193658a79404456f" exitCode=0 Mar 13 21:12:02 crc kubenswrapper[4790]: I0313 21:12:02.979030 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557272-cdqgf" event={"ID":"7b51130e-f39a-4a4c-b41e-c865e51004dd","Type":"ContainerDied","Data":"4ea25a336829635b84ca0d8e478c73129cc595166d50214c193658a79404456f"} Mar 13 21:12:04 crc kubenswrapper[4790]: I0313 21:12:04.305969 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557272-cdqgf" Mar 13 21:12:04 crc kubenswrapper[4790]: I0313 21:12:04.452499 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdcpq\" (UniqueName: \"kubernetes.io/projected/7b51130e-f39a-4a4c-b41e-c865e51004dd-kube-api-access-qdcpq\") pod \"7b51130e-f39a-4a4c-b41e-c865e51004dd\" (UID: \"7b51130e-f39a-4a4c-b41e-c865e51004dd\") " Mar 13 21:12:04 crc kubenswrapper[4790]: I0313 21:12:04.458021 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b51130e-f39a-4a4c-b41e-c865e51004dd-kube-api-access-qdcpq" (OuterVolumeSpecName: "kube-api-access-qdcpq") pod "7b51130e-f39a-4a4c-b41e-c865e51004dd" (UID: "7b51130e-f39a-4a4c-b41e-c865e51004dd"). InnerVolumeSpecName "kube-api-access-qdcpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:12:04 crc kubenswrapper[4790]: I0313 21:12:04.555328 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdcpq\" (UniqueName: \"kubernetes.io/projected/7b51130e-f39a-4a4c-b41e-c865e51004dd-kube-api-access-qdcpq\") on node \"crc\" DevicePath \"\"" Mar 13 21:12:04 crc kubenswrapper[4790]: I0313 21:12:04.996196 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557272-cdqgf" event={"ID":"7b51130e-f39a-4a4c-b41e-c865e51004dd","Type":"ContainerDied","Data":"1fa852d5331967e3d23cf1b3419e4624321eec271574a6ac797f9a29e8389d08"} Mar 13 21:12:04 crc kubenswrapper[4790]: I0313 21:12:04.996237 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557272-cdqgf" Mar 13 21:12:04 crc kubenswrapper[4790]: I0313 21:12:04.996239 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fa852d5331967e3d23cf1b3419e4624321eec271574a6ac797f9a29e8389d08" Mar 13 21:12:05 crc kubenswrapper[4790]: I0313 21:12:05.370893 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557266-4sbr6"] Mar 13 21:12:05 crc kubenswrapper[4790]: I0313 21:12:05.380002 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557266-4sbr6"] Mar 13 21:12:05 crc kubenswrapper[4790]: I0313 21:12:05.672047 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a921d70-847d-4a96-ad9a-18438299237e" path="/var/lib/kubelet/pods/6a921d70-847d-4a96-ad9a-18438299237e/volumes" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.878031 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 13 21:12:42 crc kubenswrapper[4790]: E0313 21:12:42.879774 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b51130e-f39a-4a4c-b41e-c865e51004dd" containerName="oc" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.879800 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b51130e-f39a-4a4c-b41e-c865e51004dd" containerName="oc" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.880070 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b51130e-f39a-4a4c-b41e-c865e51004dd" containerName="oc" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.880910 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.883179 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.883197 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.883951 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.884017 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-xndhm" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.895547 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.992181 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.992275 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.992301 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ssh-key\") 
pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.992324 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.992456 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbzlr\" (UniqueName: \"kubernetes.io/projected/50c1f858-4451-4e6e-9e80-6e37528305a2-kube-api-access-pbzlr\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.992577 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-config-data\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.992710 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.992731 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod 
\"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.992819 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.094976 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.095032 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.095092 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.095133 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ca-certs\") pod \"tempest-tests-tempest\" (UID: 
\"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.095197 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.095218 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.095241 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.095288 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbzlr\" (UniqueName: \"kubernetes.io/projected/50c1f858-4451-4e6e-9e80-6e37528305a2-kube-api-access-pbzlr\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.095340 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-config-data\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" 
Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.095573 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.095992 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.096333 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.096448 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.096687 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-config-data\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.102840 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.103010 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.104508 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.116534 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbzlr\" (UniqueName: \"kubernetes.io/projected/50c1f858-4451-4e6e-9e80-6e37528305a2-kube-api-access-pbzlr\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.126845 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.248568 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.681855 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 13 21:12:44 crc kubenswrapper[4790]: I0313 21:12:44.360188 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"50c1f858-4451-4e6e-9e80-6e37528305a2","Type":"ContainerStarted","Data":"9ae048789cc06b95d8d9a690f59586791ebaca094ac82840d0dd227be9680876"} Mar 13 21:12:57 crc kubenswrapper[4790]: I0313 21:12:57.391363 4790 scope.go:117] "RemoveContainer" containerID="3e0c0f63bb37da5c2b233a3b4a5d7ae121b4ed58aa4773dd2ed0d98e00fff307" Mar 13 21:13:18 crc kubenswrapper[4790]: E0313 21:13:18.862446 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 13 21:13:18 crc kubenswrapper[4790]: E0313 21:13:18.863028 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pbzlr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(50c1f858-4451-4e6e-9e80-6e37528305a2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 21:13:18 crc kubenswrapper[4790]: E0313 21:13:18.864180 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="50c1f858-4451-4e6e-9e80-6e37528305a2" Mar 13 21:13:19 crc kubenswrapper[4790]: E0313 21:13:19.718018 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="50c1f858-4451-4e6e-9e80-6e37528305a2" Mar 13 21:13:31 crc 
kubenswrapper[4790]: I0313 21:13:31.049441 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 13 21:13:32 crc kubenswrapper[4790]: I0313 21:13:32.830175 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"50c1f858-4451-4e6e-9e80-6e37528305a2","Type":"ContainerStarted","Data":"5f3dc8212dd652060ecb9c9d45ce324d2168353ccf633608e4415a58fb8949f8"} Mar 13 21:13:32 crc kubenswrapper[4790]: I0313 21:13:32.853223 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.491328204 podStartE2EDuration="51.853206943s" podCreationTimestamp="2026-03-13 21:12:41 +0000 UTC" firstStartedPulling="2026-03-13 21:12:43.684009242 +0000 UTC m=+2694.705125153" lastFinishedPulling="2026-03-13 21:13:31.045888001 +0000 UTC m=+2742.067003892" observedRunningTime="2026-03-13 21:13:32.850231069 +0000 UTC m=+2743.871346960" watchObservedRunningTime="2026-03-13 21:13:32.853206943 +0000 UTC m=+2743.874322824" Mar 13 21:13:44 crc kubenswrapper[4790]: I0313 21:13:44.015705 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:13:44 crc kubenswrapper[4790]: I0313 21:13:44.016202 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:14:00 crc kubenswrapper[4790]: I0313 21:14:00.146822 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29557274-6gbsg"] Mar 13 21:14:00 crc kubenswrapper[4790]: I0313 21:14:00.149834 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557274-6gbsg" Mar 13 21:14:00 crc kubenswrapper[4790]: I0313 21:14:00.152539 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:14:00 crc kubenswrapper[4790]: I0313 21:14:00.152709 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:14:00 crc kubenswrapper[4790]: I0313 21:14:00.153191 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:14:00 crc kubenswrapper[4790]: I0313 21:14:00.157155 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557274-6gbsg"] Mar 13 21:14:00 crc kubenswrapper[4790]: I0313 21:14:00.246943 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4pjr\" (UniqueName: \"kubernetes.io/projected/657afd7a-d901-4df2-96d4-239bf59388bd-kube-api-access-s4pjr\") pod \"auto-csr-approver-29557274-6gbsg\" (UID: \"657afd7a-d901-4df2-96d4-239bf59388bd\") " pod="openshift-infra/auto-csr-approver-29557274-6gbsg" Mar 13 21:14:00 crc kubenswrapper[4790]: I0313 21:14:00.349580 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4pjr\" (UniqueName: \"kubernetes.io/projected/657afd7a-d901-4df2-96d4-239bf59388bd-kube-api-access-s4pjr\") pod \"auto-csr-approver-29557274-6gbsg\" (UID: \"657afd7a-d901-4df2-96d4-239bf59388bd\") " pod="openshift-infra/auto-csr-approver-29557274-6gbsg" Mar 13 21:14:00 crc kubenswrapper[4790]: I0313 21:14:00.370539 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4pjr\" (UniqueName: 
\"kubernetes.io/projected/657afd7a-d901-4df2-96d4-239bf59388bd-kube-api-access-s4pjr\") pod \"auto-csr-approver-29557274-6gbsg\" (UID: \"657afd7a-d901-4df2-96d4-239bf59388bd\") " pod="openshift-infra/auto-csr-approver-29557274-6gbsg" Mar 13 21:14:00 crc kubenswrapper[4790]: I0313 21:14:00.479318 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557274-6gbsg" Mar 13 21:14:00 crc kubenswrapper[4790]: I0313 21:14:00.911763 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557274-6gbsg"] Mar 13 21:14:01 crc kubenswrapper[4790]: I0313 21:14:01.274011 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557274-6gbsg" event={"ID":"657afd7a-d901-4df2-96d4-239bf59388bd","Type":"ContainerStarted","Data":"940dcc75d9bf37c5e0b1e03ecf8787b3eb3a2b6a8a9f7ed159c34b124bbd9616"} Mar 13 21:14:03 crc kubenswrapper[4790]: I0313 21:14:03.291780 4790 generic.go:334] "Generic (PLEG): container finished" podID="657afd7a-d901-4df2-96d4-239bf59388bd" containerID="d277a0373c5a7461ab377865cd1179cec1bb76b46da5d05b6de42a92acf13b80" exitCode=0 Mar 13 21:14:03 crc kubenswrapper[4790]: I0313 21:14:03.292043 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557274-6gbsg" event={"ID":"657afd7a-d901-4df2-96d4-239bf59388bd","Type":"ContainerDied","Data":"d277a0373c5a7461ab377865cd1179cec1bb76b46da5d05b6de42a92acf13b80"} Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.428104 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v9cjw"] Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.434171 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.445291 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v9cjw"] Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.529420 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-catalog-content\") pod \"redhat-operators-v9cjw\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.529716 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-utilities\") pod \"redhat-operators-v9cjw\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.529770 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhc29\" (UniqueName: \"kubernetes.io/projected/f31292a3-5896-4bab-bd6e-bc45dffabc58-kube-api-access-dhc29\") pod \"redhat-operators-v9cjw\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.630949 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhc29\" (UniqueName: \"kubernetes.io/projected/f31292a3-5896-4bab-bd6e-bc45dffabc58-kube-api-access-dhc29\") pod \"redhat-operators-v9cjw\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.631121 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-catalog-content\") pod \"redhat-operators-v9cjw\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.631820 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-catalog-content\") pod \"redhat-operators-v9cjw\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.631222 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-utilities\") pod \"redhat-operators-v9cjw\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.632055 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-utilities\") pod \"redhat-operators-v9cjw\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.654302 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhc29\" (UniqueName: \"kubernetes.io/projected/f31292a3-5896-4bab-bd6e-bc45dffabc58-kube-api-access-dhc29\") pod \"redhat-operators-v9cjw\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.731350 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557274-6gbsg" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.733094 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4pjr\" (UniqueName: \"kubernetes.io/projected/657afd7a-d901-4df2-96d4-239bf59388bd-kube-api-access-s4pjr\") pod \"657afd7a-d901-4df2-96d4-239bf59388bd\" (UID: \"657afd7a-d901-4df2-96d4-239bf59388bd\") " Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.736516 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/657afd7a-d901-4df2-96d4-239bf59388bd-kube-api-access-s4pjr" (OuterVolumeSpecName: "kube-api-access-s4pjr") pod "657afd7a-d901-4df2-96d4-239bf59388bd" (UID: "657afd7a-d901-4df2-96d4-239bf59388bd"). InnerVolumeSpecName "kube-api-access-s4pjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.757279 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.835765 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4pjr\" (UniqueName: \"kubernetes.io/projected/657afd7a-d901-4df2-96d4-239bf59388bd-kube-api-access-s4pjr\") on node \"crc\" DevicePath \"\"" Mar 13 21:14:05 crc kubenswrapper[4790]: W0313 21:14:05.226745 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf31292a3_5896_4bab_bd6e_bc45dffabc58.slice/crio-37d2392ec75de9669244c14c79a003ddd20c18d7c108d9be1229bc52a91bd272 WatchSource:0}: Error finding container 37d2392ec75de9669244c14c79a003ddd20c18d7c108d9be1229bc52a91bd272: Status 404 returned error can't find the container with id 37d2392ec75de9669244c14c79a003ddd20c18d7c108d9be1229bc52a91bd272 Mar 13 21:14:05 crc kubenswrapper[4790]: I0313 21:14:05.229973 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v9cjw"] Mar 13 21:14:05 crc kubenswrapper[4790]: I0313 21:14:05.323801 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9cjw" event={"ID":"f31292a3-5896-4bab-bd6e-bc45dffabc58","Type":"ContainerStarted","Data":"37d2392ec75de9669244c14c79a003ddd20c18d7c108d9be1229bc52a91bd272"} Mar 13 21:14:05 crc kubenswrapper[4790]: I0313 21:14:05.326763 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557274-6gbsg" event={"ID":"657afd7a-d901-4df2-96d4-239bf59388bd","Type":"ContainerDied","Data":"940dcc75d9bf37c5e0b1e03ecf8787b3eb3a2b6a8a9f7ed159c34b124bbd9616"} Mar 13 21:14:05 crc kubenswrapper[4790]: I0313 21:14:05.326824 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="940dcc75d9bf37c5e0b1e03ecf8787b3eb3a2b6a8a9f7ed159c34b124bbd9616" Mar 13 21:14:05 crc kubenswrapper[4790]: I0313 21:14:05.326883 4790 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557274-6gbsg" Mar 13 21:14:05 crc kubenswrapper[4790]: I0313 21:14:05.845180 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557268-fbrfb"] Mar 13 21:14:05 crc kubenswrapper[4790]: I0313 21:14:05.864695 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557268-fbrfb"] Mar 13 21:14:06 crc kubenswrapper[4790]: I0313 21:14:06.335308 4790 generic.go:334] "Generic (PLEG): container finished" podID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerID="fbc979b0d27b069657241118bbbe542449b826b47099ff8b3720cdaabc9eea9e" exitCode=0 Mar 13 21:14:06 crc kubenswrapper[4790]: I0313 21:14:06.335406 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9cjw" event={"ID":"f31292a3-5896-4bab-bd6e-bc45dffabc58","Type":"ContainerDied","Data":"fbc979b0d27b069657241118bbbe542449b826b47099ff8b3720cdaabc9eea9e"} Mar 13 21:14:07 crc kubenswrapper[4790]: I0313 21:14:07.673228 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f962ddd-b18b-43c7-81e1-7eda48d64d88" path="/var/lib/kubelet/pods/5f962ddd-b18b-43c7-81e1-7eda48d64d88/volumes" Mar 13 21:14:08 crc kubenswrapper[4790]: I0313 21:14:08.357589 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9cjw" event={"ID":"f31292a3-5896-4bab-bd6e-bc45dffabc58","Type":"ContainerStarted","Data":"d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac"} Mar 13 21:14:10 crc kubenswrapper[4790]: I0313 21:14:10.376478 4790 generic.go:334] "Generic (PLEG): container finished" podID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerID="d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac" exitCode=0 Mar 13 21:14:10 crc kubenswrapper[4790]: I0313 21:14:10.376567 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-v9cjw" event={"ID":"f31292a3-5896-4bab-bd6e-bc45dffabc58","Type":"ContainerDied","Data":"d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac"} Mar 13 21:14:11 crc kubenswrapper[4790]: I0313 21:14:11.398988 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9cjw" event={"ID":"f31292a3-5896-4bab-bd6e-bc45dffabc58","Type":"ContainerStarted","Data":"8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a"} Mar 13 21:14:11 crc kubenswrapper[4790]: I0313 21:14:11.430825 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v9cjw" podStartSLOduration=2.741520109 podStartE2EDuration="7.430807037s" podCreationTimestamp="2026-03-13 21:14:04 +0000 UTC" firstStartedPulling="2026-03-13 21:14:06.337796709 +0000 UTC m=+2777.358912590" lastFinishedPulling="2026-03-13 21:14:11.027083627 +0000 UTC m=+2782.048199518" observedRunningTime="2026-03-13 21:14:11.423573723 +0000 UTC m=+2782.444689614" watchObservedRunningTime="2026-03-13 21:14:11.430807037 +0000 UTC m=+2782.451922928" Mar 13 21:14:14 crc kubenswrapper[4790]: I0313 21:14:14.016156 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:14:14 crc kubenswrapper[4790]: I0313 21:14:14.016661 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:14:14 crc kubenswrapper[4790]: I0313 21:14:14.758110 4790 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:14 crc kubenswrapper[4790]: I0313 21:14:14.758534 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:15 crc kubenswrapper[4790]: I0313 21:14:15.802633 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v9cjw" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerName="registry-server" probeResult="failure" output=< Mar 13 21:14:15 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Mar 13 21:14:15 crc kubenswrapper[4790]: > Mar 13 21:14:18 crc kubenswrapper[4790]: I0313 21:14:18.831658 4790 scope.go:117] "RemoveContainer" containerID="bdca2f8da697e12973555a54d7d0753abfb943fd0d2919dd4adb4178a3e9c052" Mar 13 21:14:25 crc kubenswrapper[4790]: I0313 21:14:25.806437 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v9cjw" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerName="registry-server" probeResult="failure" output=< Mar 13 21:14:25 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Mar 13 21:14:25 crc kubenswrapper[4790]: > Mar 13 21:14:35 crc kubenswrapper[4790]: I0313 21:14:35.800953 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v9cjw" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerName="registry-server" probeResult="failure" output=< Mar 13 21:14:35 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Mar 13 21:14:35 crc kubenswrapper[4790]: > Mar 13 21:14:44 crc kubenswrapper[4790]: I0313 21:14:44.016257 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:14:44 crc kubenswrapper[4790]: I0313 21:14:44.016877 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:14:44 crc kubenswrapper[4790]: I0313 21:14:44.016938 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 21:14:44 crc kubenswrapper[4790]: I0313 21:14:44.017747 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75f331721e6162201038d479ba2bbbbd3f6476b2bf5be1d38a4c2de09e217795"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 21:14:44 crc kubenswrapper[4790]: I0313 21:14:44.017821 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://75f331721e6162201038d479ba2bbbbd3f6476b2bf5be1d38a4c2de09e217795" gracePeriod=600 Mar 13 21:14:44 crc kubenswrapper[4790]: I0313 21:14:44.684920 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="75f331721e6162201038d479ba2bbbbd3f6476b2bf5be1d38a4c2de09e217795" exitCode=0 Mar 13 21:14:44 crc kubenswrapper[4790]: I0313 21:14:44.684990 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" 
event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"75f331721e6162201038d479ba2bbbbd3f6476b2bf5be1d38a4c2de09e217795"} Mar 13 21:14:44 crc kubenswrapper[4790]: I0313 21:14:44.685316 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36"} Mar 13 21:14:44 crc kubenswrapper[4790]: I0313 21:14:44.685339 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:14:44 crc kubenswrapper[4790]: I0313 21:14:44.808643 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:44 crc kubenswrapper[4790]: I0313 21:14:44.864700 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:45 crc kubenswrapper[4790]: I0313 21:14:45.047226 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v9cjw"] Mar 13 21:14:46 crc kubenswrapper[4790]: I0313 21:14:46.702165 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v9cjw" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerName="registry-server" containerID="cri-o://8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a" gracePeriod=2 Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.201171 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.369883 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-utilities\") pod \"f31292a3-5896-4bab-bd6e-bc45dffabc58\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.370018 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhc29\" (UniqueName: \"kubernetes.io/projected/f31292a3-5896-4bab-bd6e-bc45dffabc58-kube-api-access-dhc29\") pod \"f31292a3-5896-4bab-bd6e-bc45dffabc58\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.370133 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-catalog-content\") pod \"f31292a3-5896-4bab-bd6e-bc45dffabc58\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.370807 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-utilities" (OuterVolumeSpecName: "utilities") pod "f31292a3-5896-4bab-bd6e-bc45dffabc58" (UID: "f31292a3-5896-4bab-bd6e-bc45dffabc58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.377460 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f31292a3-5896-4bab-bd6e-bc45dffabc58-kube-api-access-dhc29" (OuterVolumeSpecName: "kube-api-access-dhc29") pod "f31292a3-5896-4bab-bd6e-bc45dffabc58" (UID: "f31292a3-5896-4bab-bd6e-bc45dffabc58"). InnerVolumeSpecName "kube-api-access-dhc29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.472190 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.472230 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhc29\" (UniqueName: \"kubernetes.io/projected/f31292a3-5896-4bab-bd6e-bc45dffabc58-kube-api-access-dhc29\") on node \"crc\" DevicePath \"\"" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.506336 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f31292a3-5896-4bab-bd6e-bc45dffabc58" (UID: "f31292a3-5896-4bab-bd6e-bc45dffabc58"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.573511 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.735098 4790 generic.go:334] "Generic (PLEG): container finished" podID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerID="8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a" exitCode=0 Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.735509 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9cjw" event={"ID":"f31292a3-5896-4bab-bd6e-bc45dffabc58","Type":"ContainerDied","Data":"8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a"} Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.735546 4790 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-v9cjw" event={"ID":"f31292a3-5896-4bab-bd6e-bc45dffabc58","Type":"ContainerDied","Data":"37d2392ec75de9669244c14c79a003ddd20c18d7c108d9be1229bc52a91bd272"} Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.735567 4790 scope.go:117] "RemoveContainer" containerID="8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.735838 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.769978 4790 scope.go:117] "RemoveContainer" containerID="d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.771234 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v9cjw"] Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.799268 4790 scope.go:117] "RemoveContainer" containerID="fbc979b0d27b069657241118bbbe542449b826b47099ff8b3720cdaabc9eea9e" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.802394 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v9cjw"] Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.849884 4790 scope.go:117] "RemoveContainer" containerID="8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a" Mar 13 21:14:47 crc kubenswrapper[4790]: E0313 21:14:47.851030 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a\": container with ID starting with 8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a not found: ID does not exist" containerID="8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.851101 4790 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a"} err="failed to get container status \"8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a\": rpc error: code = NotFound desc = could not find container \"8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a\": container with ID starting with 8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a not found: ID does not exist" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.851156 4790 scope.go:117] "RemoveContainer" containerID="d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac" Mar 13 21:14:47 crc kubenswrapper[4790]: E0313 21:14:47.852036 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac\": container with ID starting with d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac not found: ID does not exist" containerID="d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.852063 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac"} err="failed to get container status \"d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac\": rpc error: code = NotFound desc = could not find container \"d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac\": container with ID starting with d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac not found: ID does not exist" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.852079 4790 scope.go:117] "RemoveContainer" containerID="fbc979b0d27b069657241118bbbe542449b826b47099ff8b3720cdaabc9eea9e" Mar 13 21:14:47 crc kubenswrapper[4790]: E0313 
21:14:47.852829 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbc979b0d27b069657241118bbbe542449b826b47099ff8b3720cdaabc9eea9e\": container with ID starting with fbc979b0d27b069657241118bbbe542449b826b47099ff8b3720cdaabc9eea9e not found: ID does not exist" containerID="fbc979b0d27b069657241118bbbe542449b826b47099ff8b3720cdaabc9eea9e" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.852875 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbc979b0d27b069657241118bbbe542449b826b47099ff8b3720cdaabc9eea9e"} err="failed to get container status \"fbc979b0d27b069657241118bbbe542449b826b47099ff8b3720cdaabc9eea9e\": rpc error: code = NotFound desc = could not find container \"fbc979b0d27b069657241118bbbe542449b826b47099ff8b3720cdaabc9eea9e\": container with ID starting with fbc979b0d27b069657241118bbbe542449b826b47099ff8b3720cdaabc9eea9e not found: ID does not exist" Mar 13 21:14:49 crc kubenswrapper[4790]: I0313 21:14:49.678124 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" path="/var/lib/kubelet/pods/f31292a3-5896-4bab-bd6e-bc45dffabc58/volumes" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.138939 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j"] Mar 13 21:15:00 crc kubenswrapper[4790]: E0313 21:15:00.139879 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="657afd7a-d901-4df2-96d4-239bf59388bd" containerName="oc" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.139898 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="657afd7a-d901-4df2-96d4-239bf59388bd" containerName="oc" Mar 13 21:15:00 crc kubenswrapper[4790]: E0313 21:15:00.139937 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" 
containerName="extract-utilities" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.139946 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerName="extract-utilities" Mar 13 21:15:00 crc kubenswrapper[4790]: E0313 21:15:00.139983 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerName="extract-content" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.139991 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerName="extract-content" Mar 13 21:15:00 crc kubenswrapper[4790]: E0313 21:15:00.140001 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerName="registry-server" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.140009 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerName="registry-server" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.140218 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerName="registry-server" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.140255 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="657afd7a-d901-4df2-96d4-239bf59388bd" containerName="oc" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.141100 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.143129 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.144025 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.163825 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j"] Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.305089 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d955a3c8-0b10-4040-8fc8-043862800b24-secret-volume\") pod \"collect-profiles-29557275-nxf7j\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.305258 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jdpx\" (UniqueName: \"kubernetes.io/projected/d955a3c8-0b10-4040-8fc8-043862800b24-kube-api-access-2jdpx\") pod \"collect-profiles-29557275-nxf7j\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.305456 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d955a3c8-0b10-4040-8fc8-043862800b24-config-volume\") pod \"collect-profiles-29557275-nxf7j\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.407589 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jdpx\" (UniqueName: \"kubernetes.io/projected/d955a3c8-0b10-4040-8fc8-043862800b24-kube-api-access-2jdpx\") pod \"collect-profiles-29557275-nxf7j\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.407668 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d955a3c8-0b10-4040-8fc8-043862800b24-config-volume\") pod \"collect-profiles-29557275-nxf7j\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.407740 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d955a3c8-0b10-4040-8fc8-043862800b24-secret-volume\") pod \"collect-profiles-29557275-nxf7j\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.408597 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d955a3c8-0b10-4040-8fc8-043862800b24-config-volume\") pod \"collect-profiles-29557275-nxf7j\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.413149 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d955a3c8-0b10-4040-8fc8-043862800b24-secret-volume\") pod \"collect-profiles-29557275-nxf7j\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.423015 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jdpx\" (UniqueName: \"kubernetes.io/projected/d955a3c8-0b10-4040-8fc8-043862800b24-kube-api-access-2jdpx\") pod \"collect-profiles-29557275-nxf7j\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.465031 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.904021 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j"] Mar 13 21:15:01 crc kubenswrapper[4790]: I0313 21:15:01.866241 4790 generic.go:334] "Generic (PLEG): container finished" podID="d955a3c8-0b10-4040-8fc8-043862800b24" containerID="51a0b2215403cc456953ad266cc3365cf4785481bce1dd6c596170361ac34e20" exitCode=0 Mar 13 21:15:01 crc kubenswrapper[4790]: I0313 21:15:01.866311 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" event={"ID":"d955a3c8-0b10-4040-8fc8-043862800b24","Type":"ContainerDied","Data":"51a0b2215403cc456953ad266cc3365cf4785481bce1dd6c596170361ac34e20"} Mar 13 21:15:01 crc kubenswrapper[4790]: I0313 21:15:01.866582 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" 
event={"ID":"d955a3c8-0b10-4040-8fc8-043862800b24","Type":"ContainerStarted","Data":"356d142c1c345b4cc1b5d0c6e53ffc8fb48ebe85a954a45f5c6fda7d8f27ad0d"} Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.348857 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.464266 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d955a3c8-0b10-4040-8fc8-043862800b24-config-volume\") pod \"d955a3c8-0b10-4040-8fc8-043862800b24\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.464623 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d955a3c8-0b10-4040-8fc8-043862800b24-secret-volume\") pod \"d955a3c8-0b10-4040-8fc8-043862800b24\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.464759 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jdpx\" (UniqueName: \"kubernetes.io/projected/d955a3c8-0b10-4040-8fc8-043862800b24-kube-api-access-2jdpx\") pod \"d955a3c8-0b10-4040-8fc8-043862800b24\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.465125 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d955a3c8-0b10-4040-8fc8-043862800b24-config-volume" (OuterVolumeSpecName: "config-volume") pod "d955a3c8-0b10-4040-8fc8-043862800b24" (UID: "d955a3c8-0b10-4040-8fc8-043862800b24"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.465384 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d955a3c8-0b10-4040-8fc8-043862800b24-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.469629 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d955a3c8-0b10-4040-8fc8-043862800b24-kube-api-access-2jdpx" (OuterVolumeSpecName: "kube-api-access-2jdpx") pod "d955a3c8-0b10-4040-8fc8-043862800b24" (UID: "d955a3c8-0b10-4040-8fc8-043862800b24"). InnerVolumeSpecName "kube-api-access-2jdpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.470569 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d955a3c8-0b10-4040-8fc8-043862800b24-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d955a3c8-0b10-4040-8fc8-043862800b24" (UID: "d955a3c8-0b10-4040-8fc8-043862800b24"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.567142 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d955a3c8-0b10-4040-8fc8-043862800b24-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.567183 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jdpx\" (UniqueName: \"kubernetes.io/projected/d955a3c8-0b10-4040-8fc8-043862800b24-kube-api-access-2jdpx\") on node \"crc\" DevicePath \"\"" Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.885183 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" event={"ID":"d955a3c8-0b10-4040-8fc8-043862800b24","Type":"ContainerDied","Data":"356d142c1c345b4cc1b5d0c6e53ffc8fb48ebe85a954a45f5c6fda7d8f27ad0d"} Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.885231 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="356d142c1c345b4cc1b5d0c6e53ffc8fb48ebe85a954a45f5c6fda7d8f27ad0d" Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.885241 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:04 crc kubenswrapper[4790]: I0313 21:15:04.425149 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn"] Mar 13 21:15:04 crc kubenswrapper[4790]: I0313 21:15:04.434190 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn"] Mar 13 21:15:05 crc kubenswrapper[4790]: I0313 21:15:05.671453 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87e4f09f-d19e-4b0a-85b2-636b5ce5ef51" path="/var/lib/kubelet/pods/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51/volumes" Mar 13 21:15:18 crc kubenswrapper[4790]: I0313 21:15:18.916371 4790 scope.go:117] "RemoveContainer" containerID="93c1f10337c2883de8c80150a75f7613328eeffafc6c4c7570ee71639cf9048a" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.147163 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557276-9dtrx"] Mar 13 21:16:00 crc kubenswrapper[4790]: E0313 21:16:00.148201 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d955a3c8-0b10-4040-8fc8-043862800b24" containerName="collect-profiles" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.148217 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d955a3c8-0b10-4040-8fc8-043862800b24" containerName="collect-profiles" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.148460 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d955a3c8-0b10-4040-8fc8-043862800b24" containerName="collect-profiles" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.149218 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557276-9dtrx" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.152082 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.152613 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.153642 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.158191 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557276-9dtrx"] Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.183955 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvcf5\" (UniqueName: \"kubernetes.io/projected/1f2961af-f195-403c-bfa2-fd01638789d4-kube-api-access-zvcf5\") pod \"auto-csr-approver-29557276-9dtrx\" (UID: \"1f2961af-f195-403c-bfa2-fd01638789d4\") " pod="openshift-infra/auto-csr-approver-29557276-9dtrx" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.287422 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvcf5\" (UniqueName: \"kubernetes.io/projected/1f2961af-f195-403c-bfa2-fd01638789d4-kube-api-access-zvcf5\") pod \"auto-csr-approver-29557276-9dtrx\" (UID: \"1f2961af-f195-403c-bfa2-fd01638789d4\") " pod="openshift-infra/auto-csr-approver-29557276-9dtrx" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.305808 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvcf5\" (UniqueName: \"kubernetes.io/projected/1f2961af-f195-403c-bfa2-fd01638789d4-kube-api-access-zvcf5\") pod \"auto-csr-approver-29557276-9dtrx\" (UID: \"1f2961af-f195-403c-bfa2-fd01638789d4\") " 
pod="openshift-infra/auto-csr-approver-29557276-9dtrx" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.468038 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557276-9dtrx" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.938662 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557276-9dtrx"] Mar 13 21:16:01 crc kubenswrapper[4790]: I0313 21:16:01.378605 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557276-9dtrx" event={"ID":"1f2961af-f195-403c-bfa2-fd01638789d4","Type":"ContainerStarted","Data":"8e891209a6d777a268d6770601327cf5c9c35bc17f362059f286b01cbcf4ab2b"} Mar 13 21:16:02 crc kubenswrapper[4790]: I0313 21:16:02.393880 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557276-9dtrx" event={"ID":"1f2961af-f195-403c-bfa2-fd01638789d4","Type":"ContainerStarted","Data":"1ca2f5093d0685422bd455148422a18e13291d0f890d95cbb35dbff344da7e0a"} Mar 13 21:16:02 crc kubenswrapper[4790]: I0313 21:16:02.413791 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557276-9dtrx" podStartSLOduration=1.277003166 podStartE2EDuration="2.413772539s" podCreationTimestamp="2026-03-13 21:16:00 +0000 UTC" firstStartedPulling="2026-03-13 21:16:00.946804299 +0000 UTC m=+2891.967920190" lastFinishedPulling="2026-03-13 21:16:02.083573672 +0000 UTC m=+2893.104689563" observedRunningTime="2026-03-13 21:16:02.412143634 +0000 UTC m=+2893.433259535" watchObservedRunningTime="2026-03-13 21:16:02.413772539 +0000 UTC m=+2893.434888430" Mar 13 21:16:03 crc kubenswrapper[4790]: I0313 21:16:03.404031 4790 generic.go:334] "Generic (PLEG): container finished" podID="1f2961af-f195-403c-bfa2-fd01638789d4" containerID="1ca2f5093d0685422bd455148422a18e13291d0f890d95cbb35dbff344da7e0a" exitCode=0 Mar 13 21:16:03 crc 
kubenswrapper[4790]: I0313 21:16:03.404079 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557276-9dtrx" event={"ID":"1f2961af-f195-403c-bfa2-fd01638789d4","Type":"ContainerDied","Data":"1ca2f5093d0685422bd455148422a18e13291d0f890d95cbb35dbff344da7e0a"} Mar 13 21:16:04 crc kubenswrapper[4790]: I0313 21:16:04.845746 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557276-9dtrx" Mar 13 21:16:04 crc kubenswrapper[4790]: I0313 21:16:04.874339 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvcf5\" (UniqueName: \"kubernetes.io/projected/1f2961af-f195-403c-bfa2-fd01638789d4-kube-api-access-zvcf5\") pod \"1f2961af-f195-403c-bfa2-fd01638789d4\" (UID: \"1f2961af-f195-403c-bfa2-fd01638789d4\") " Mar 13 21:16:04 crc kubenswrapper[4790]: I0313 21:16:04.881090 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f2961af-f195-403c-bfa2-fd01638789d4-kube-api-access-zvcf5" (OuterVolumeSpecName: "kube-api-access-zvcf5") pod "1f2961af-f195-403c-bfa2-fd01638789d4" (UID: "1f2961af-f195-403c-bfa2-fd01638789d4"). InnerVolumeSpecName "kube-api-access-zvcf5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:16:04 crc kubenswrapper[4790]: I0313 21:16:04.976228 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvcf5\" (UniqueName: \"kubernetes.io/projected/1f2961af-f195-403c-bfa2-fd01638789d4-kube-api-access-zvcf5\") on node \"crc\" DevicePath \"\"" Mar 13 21:16:05 crc kubenswrapper[4790]: I0313 21:16:05.423894 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557276-9dtrx" event={"ID":"1f2961af-f195-403c-bfa2-fd01638789d4","Type":"ContainerDied","Data":"8e891209a6d777a268d6770601327cf5c9c35bc17f362059f286b01cbcf4ab2b"} Mar 13 21:16:05 crc kubenswrapper[4790]: I0313 21:16:05.423930 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e891209a6d777a268d6770601327cf5c9c35bc17f362059f286b01cbcf4ab2b" Mar 13 21:16:05 crc kubenswrapper[4790]: I0313 21:16:05.423967 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557276-9dtrx" Mar 13 21:16:05 crc kubenswrapper[4790]: I0313 21:16:05.485425 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557270-xndpr"] Mar 13 21:16:05 crc kubenswrapper[4790]: I0313 21:16:05.492843 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557270-xndpr"] Mar 13 21:16:05 crc kubenswrapper[4790]: I0313 21:16:05.669655 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68f751f6-8e31-448a-99e9-bf7f290684be" path="/var/lib/kubelet/pods/68f751f6-8e31-448a-99e9-bf7f290684be/volumes" Mar 13 21:16:18 crc kubenswrapper[4790]: I0313 21:16:18.996511 4790 scope.go:117] "RemoveContainer" containerID="231c0b730759ce0ec6fa00fad0e521d17888055794c0179d0b2c116cf68aaf15" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.137211 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-mngmq"] Mar 13 21:16:30 crc kubenswrapper[4790]: E0313 21:16:30.138272 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f2961af-f195-403c-bfa2-fd01638789d4" containerName="oc" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.138290 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2961af-f195-403c-bfa2-fd01638789d4" containerName="oc" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.138554 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f2961af-f195-403c-bfa2-fd01638789d4" containerName="oc" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.140162 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.154123 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mngmq"] Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.192942 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-utilities\") pod \"redhat-marketplace-mngmq\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.193094 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-catalog-content\") pod \"redhat-marketplace-mngmq\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.193134 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh4pw\" (UniqueName: 
\"kubernetes.io/projected/bd4a25b0-b765-448b-aebc-895c1e6a13ce-kube-api-access-wh4pw\") pod \"redhat-marketplace-mngmq\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.294487 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-utilities\") pod \"redhat-marketplace-mngmq\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.294827 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-catalog-content\") pod \"redhat-marketplace-mngmq\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.294921 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh4pw\" (UniqueName: \"kubernetes.io/projected/bd4a25b0-b765-448b-aebc-895c1e6a13ce-kube-api-access-wh4pw\") pod \"redhat-marketplace-mngmq\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.295031 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-utilities\") pod \"redhat-marketplace-mngmq\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.295089 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-catalog-content\") pod \"redhat-marketplace-mngmq\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.314271 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh4pw\" (UniqueName: \"kubernetes.io/projected/bd4a25b0-b765-448b-aebc-895c1e6a13ce-kube-api-access-wh4pw\") pod \"redhat-marketplace-mngmq\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.477262 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.982509 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mngmq"] Mar 13 21:16:31 crc kubenswrapper[4790]: I0313 21:16:31.670463 4790 generic.go:334] "Generic (PLEG): container finished" podID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" containerID="09936d0bd181cb6ade2618ada68af7a27e78e9bbf621ffa032928d059e4afe15" exitCode=0 Mar 13 21:16:31 crc kubenswrapper[4790]: I0313 21:16:31.694659 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mngmq" event={"ID":"bd4a25b0-b765-448b-aebc-895c1e6a13ce","Type":"ContainerDied","Data":"09936d0bd181cb6ade2618ada68af7a27e78e9bbf621ffa032928d059e4afe15"} Mar 13 21:16:31 crc kubenswrapper[4790]: I0313 21:16:31.694697 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mngmq" event={"ID":"bd4a25b0-b765-448b-aebc-895c1e6a13ce","Type":"ContainerStarted","Data":"a3eae7ae0ac98bea414db557f54a72878ac40df96b07b869c33dd07262bd97bc"} Mar 13 21:16:32 crc kubenswrapper[4790]: I0313 21:16:32.681090 4790 generic.go:334] "Generic (PLEG): container 
finished" podID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" containerID="5902ecfb9dff503c053ca33a53874026ebf550887f272fa99bb7c65eddd1071f" exitCode=0 Mar 13 21:16:32 crc kubenswrapper[4790]: I0313 21:16:32.681205 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mngmq" event={"ID":"bd4a25b0-b765-448b-aebc-895c1e6a13ce","Type":"ContainerDied","Data":"5902ecfb9dff503c053ca33a53874026ebf550887f272fa99bb7c65eddd1071f"} Mar 13 21:16:33 crc kubenswrapper[4790]: I0313 21:16:33.696908 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mngmq" event={"ID":"bd4a25b0-b765-448b-aebc-895c1e6a13ce","Type":"ContainerStarted","Data":"6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec"} Mar 13 21:16:33 crc kubenswrapper[4790]: I0313 21:16:33.721335 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mngmq" podStartSLOduration=2.319261492 podStartE2EDuration="3.721312353s" podCreationTimestamp="2026-03-13 21:16:30 +0000 UTC" firstStartedPulling="2026-03-13 21:16:31.679864235 +0000 UTC m=+2922.700980126" lastFinishedPulling="2026-03-13 21:16:33.081915096 +0000 UTC m=+2924.103030987" observedRunningTime="2026-03-13 21:16:33.71357336 +0000 UTC m=+2924.734689251" watchObservedRunningTime="2026-03-13 21:16:33.721312353 +0000 UTC m=+2924.742428244" Mar 13 21:16:40 crc kubenswrapper[4790]: I0313 21:16:40.477629 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:40 crc kubenswrapper[4790]: I0313 21:16:40.478258 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:40 crc kubenswrapper[4790]: I0313 21:16:40.527145 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 
21:16:40 crc kubenswrapper[4790]: I0313 21:16:40.826419 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:40 crc kubenswrapper[4790]: I0313 21:16:40.876200 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mngmq"] Mar 13 21:16:42 crc kubenswrapper[4790]: I0313 21:16:42.798115 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mngmq" podUID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" containerName="registry-server" containerID="cri-o://6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec" gracePeriod=2 Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.267881 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.354358 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-utilities\") pod \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.354535 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh4pw\" (UniqueName: \"kubernetes.io/projected/bd4a25b0-b765-448b-aebc-895c1e6a13ce-kube-api-access-wh4pw\") pod \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.354663 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-catalog-content\") pod \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " Mar 13 
21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.358173 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-utilities" (OuterVolumeSpecName: "utilities") pod "bd4a25b0-b765-448b-aebc-895c1e6a13ce" (UID: "bd4a25b0-b765-448b-aebc-895c1e6a13ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.366561 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd4a25b0-b765-448b-aebc-895c1e6a13ce-kube-api-access-wh4pw" (OuterVolumeSpecName: "kube-api-access-wh4pw") pod "bd4a25b0-b765-448b-aebc-895c1e6a13ce" (UID: "bd4a25b0-b765-448b-aebc-895c1e6a13ce"). InnerVolumeSpecName "kube-api-access-wh4pw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.457441 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.457490 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh4pw\" (UniqueName: \"kubernetes.io/projected/bd4a25b0-b765-448b-aebc-895c1e6a13ce-kube-api-access-wh4pw\") on node \"crc\" DevicePath \"\"" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.807654 4790 generic.go:334] "Generic (PLEG): container finished" podID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" containerID="6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec" exitCode=0 Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.807809 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mngmq" event={"ID":"bd4a25b0-b765-448b-aebc-895c1e6a13ce","Type":"ContainerDied","Data":"6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec"} 
Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.807952 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mngmq" event={"ID":"bd4a25b0-b765-448b-aebc-895c1e6a13ce","Type":"ContainerDied","Data":"a3eae7ae0ac98bea414db557f54a72878ac40df96b07b869c33dd07262bd97bc"} Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.807976 4790 scope.go:117] "RemoveContainer" containerID="6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.807898 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.832953 4790 scope.go:117] "RemoveContainer" containerID="5902ecfb9dff503c053ca33a53874026ebf550887f272fa99bb7c65eddd1071f" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.851951 4790 scope.go:117] "RemoveContainer" containerID="09936d0bd181cb6ade2618ada68af7a27e78e9bbf621ffa032928d059e4afe15" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.874805 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd4a25b0-b765-448b-aebc-895c1e6a13ce" (UID: "bd4a25b0-b765-448b-aebc-895c1e6a13ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.897736 4790 scope.go:117] "RemoveContainer" containerID="6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec" Mar 13 21:16:43 crc kubenswrapper[4790]: E0313 21:16:43.898110 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec\": container with ID starting with 6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec not found: ID does not exist" containerID="6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.898143 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec"} err="failed to get container status \"6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec\": rpc error: code = NotFound desc = could not find container \"6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec\": container with ID starting with 6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec not found: ID does not exist" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.898161 4790 scope.go:117] "RemoveContainer" containerID="5902ecfb9dff503c053ca33a53874026ebf550887f272fa99bb7c65eddd1071f" Mar 13 21:16:43 crc kubenswrapper[4790]: E0313 21:16:43.898622 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5902ecfb9dff503c053ca33a53874026ebf550887f272fa99bb7c65eddd1071f\": container with ID starting with 5902ecfb9dff503c053ca33a53874026ebf550887f272fa99bb7c65eddd1071f not found: ID does not exist" containerID="5902ecfb9dff503c053ca33a53874026ebf550887f272fa99bb7c65eddd1071f" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.898672 
4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5902ecfb9dff503c053ca33a53874026ebf550887f272fa99bb7c65eddd1071f"} err="failed to get container status \"5902ecfb9dff503c053ca33a53874026ebf550887f272fa99bb7c65eddd1071f\": rpc error: code = NotFound desc = could not find container \"5902ecfb9dff503c053ca33a53874026ebf550887f272fa99bb7c65eddd1071f\": container with ID starting with 5902ecfb9dff503c053ca33a53874026ebf550887f272fa99bb7c65eddd1071f not found: ID does not exist" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.898708 4790 scope.go:117] "RemoveContainer" containerID="09936d0bd181cb6ade2618ada68af7a27e78e9bbf621ffa032928d059e4afe15" Mar 13 21:16:43 crc kubenswrapper[4790]: E0313 21:16:43.899005 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09936d0bd181cb6ade2618ada68af7a27e78e9bbf621ffa032928d059e4afe15\": container with ID starting with 09936d0bd181cb6ade2618ada68af7a27e78e9bbf621ffa032928d059e4afe15 not found: ID does not exist" containerID="09936d0bd181cb6ade2618ada68af7a27e78e9bbf621ffa032928d059e4afe15" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.899031 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09936d0bd181cb6ade2618ada68af7a27e78e9bbf621ffa032928d059e4afe15"} err="failed to get container status \"09936d0bd181cb6ade2618ada68af7a27e78e9bbf621ffa032928d059e4afe15\": rpc error: code = NotFound desc = could not find container \"09936d0bd181cb6ade2618ada68af7a27e78e9bbf621ffa032928d059e4afe15\": container with ID starting with 09936d0bd181cb6ade2618ada68af7a27e78e9bbf621ffa032928d059e4afe15 not found: ID does not exist" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.967444 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:16:44 crc kubenswrapper[4790]: I0313 21:16:44.015531 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:16:44 crc kubenswrapper[4790]: I0313 21:16:44.015585 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:16:44 crc kubenswrapper[4790]: I0313 21:16:44.139135 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mngmq"] Mar 13 21:16:44 crc kubenswrapper[4790]: I0313 21:16:44.149800 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mngmq"] Mar 13 21:16:45 crc kubenswrapper[4790]: I0313 21:16:45.671271 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" path="/var/lib/kubelet/pods/bd4a25b0-b765-448b-aebc-895c1e6a13ce/volumes" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.556074 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rvlm5"] Mar 13 21:17:05 crc kubenswrapper[4790]: E0313 21:17:05.557111 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" containerName="extract-utilities" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.557134 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" 
containerName="extract-utilities" Mar 13 21:17:05 crc kubenswrapper[4790]: E0313 21:17:05.557156 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" containerName="extract-content" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.557164 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" containerName="extract-content" Mar 13 21:17:05 crc kubenswrapper[4790]: E0313 21:17:05.557197 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" containerName="registry-server" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.557204 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" containerName="registry-server" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.557453 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" containerName="registry-server" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.559117 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.567004 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rvlm5"] Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.661399 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkpd6\" (UniqueName: \"kubernetes.io/projected/848a94d0-7273-4bd8-a9a3-37a0c83d021d-kube-api-access-zkpd6\") pod \"certified-operators-rvlm5\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.661487 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-catalog-content\") pod \"certified-operators-rvlm5\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.661624 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-utilities\") pod \"certified-operators-rvlm5\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.763543 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-utilities\") pod \"certified-operators-rvlm5\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.763643 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zkpd6\" (UniqueName: \"kubernetes.io/projected/848a94d0-7273-4bd8-a9a3-37a0c83d021d-kube-api-access-zkpd6\") pod \"certified-operators-rvlm5\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.763693 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-catalog-content\") pod \"certified-operators-rvlm5\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.764155 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-utilities\") pod \"certified-operators-rvlm5\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.764921 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-catalog-content\") pod \"certified-operators-rvlm5\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.789310 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkpd6\" (UniqueName: \"kubernetes.io/projected/848a94d0-7273-4bd8-a9a3-37a0c83d021d-kube-api-access-zkpd6\") pod \"certified-operators-rvlm5\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.880632 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:06 crc kubenswrapper[4790]: I0313 21:17:06.390443 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rvlm5"] Mar 13 21:17:07 crc kubenswrapper[4790]: I0313 21:17:07.000778 4790 generic.go:334] "Generic (PLEG): container finished" podID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" containerID="6940446cbfb31e7ecbec55ccf29dd9b2ccbde4bc5cd9721825330a7879a2859b" exitCode=0 Mar 13 21:17:07 crc kubenswrapper[4790]: I0313 21:17:07.000815 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvlm5" event={"ID":"848a94d0-7273-4bd8-a9a3-37a0c83d021d","Type":"ContainerDied","Data":"6940446cbfb31e7ecbec55ccf29dd9b2ccbde4bc5cd9721825330a7879a2859b"} Mar 13 21:17:07 crc kubenswrapper[4790]: I0313 21:17:07.000846 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvlm5" event={"ID":"848a94d0-7273-4bd8-a9a3-37a0c83d021d","Type":"ContainerStarted","Data":"6b0fb3482331fe9ebb6936e0aa0a4fc44603cc49316d776e27cc517752a36fee"} Mar 13 21:17:07 crc kubenswrapper[4790]: I0313 21:17:07.002537 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 21:17:08 crc kubenswrapper[4790]: I0313 21:17:08.010736 4790 generic.go:334] "Generic (PLEG): container finished" podID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" containerID="9fd47b72534179044f5f7a1a338a235cdc625a54ec35157007a57d9ea25ee65e" exitCode=0 Mar 13 21:17:08 crc kubenswrapper[4790]: I0313 21:17:08.010852 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvlm5" event={"ID":"848a94d0-7273-4bd8-a9a3-37a0c83d021d","Type":"ContainerDied","Data":"9fd47b72534179044f5f7a1a338a235cdc625a54ec35157007a57d9ea25ee65e"} Mar 13 21:17:09 crc kubenswrapper[4790]: I0313 21:17:09.022111 4790 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-rvlm5" event={"ID":"848a94d0-7273-4bd8-a9a3-37a0c83d021d","Type":"ContainerStarted","Data":"11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42"} Mar 13 21:17:09 crc kubenswrapper[4790]: I0313 21:17:09.040808 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rvlm5" podStartSLOduration=2.609373885 podStartE2EDuration="4.040787341s" podCreationTimestamp="2026-03-13 21:17:05 +0000 UTC" firstStartedPulling="2026-03-13 21:17:07.002298673 +0000 UTC m=+2958.023414554" lastFinishedPulling="2026-03-13 21:17:08.433712119 +0000 UTC m=+2959.454828010" observedRunningTime="2026-03-13 21:17:09.03929218 +0000 UTC m=+2960.060408071" watchObservedRunningTime="2026-03-13 21:17:09.040787341 +0000 UTC m=+2960.061903242" Mar 13 21:17:14 crc kubenswrapper[4790]: I0313 21:17:14.016047 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:17:14 crc kubenswrapper[4790]: I0313 21:17:14.016517 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:17:15 crc kubenswrapper[4790]: I0313 21:17:15.881817 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:15 crc kubenswrapper[4790]: I0313 21:17:15.882187 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:15 
crc kubenswrapper[4790]: I0313 21:17:15.928859 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:16 crc kubenswrapper[4790]: I0313 21:17:16.121240 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:16 crc kubenswrapper[4790]: I0313 21:17:16.163276 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rvlm5"] Mar 13 21:17:18 crc kubenswrapper[4790]: I0313 21:17:18.093190 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rvlm5" podUID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" containerName="registry-server" containerID="cri-o://11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42" gracePeriod=2 Mar 13 21:17:18 crc kubenswrapper[4790]: I0313 21:17:18.600454 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:18 crc kubenswrapper[4790]: I0313 21:17:18.699408 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-utilities\") pod \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " Mar 13 21:17:18 crc kubenswrapper[4790]: I0313 21:17:18.699771 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-catalog-content\") pod \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " Mar 13 21:17:18 crc kubenswrapper[4790]: I0313 21:17:18.699958 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkpd6\" (UniqueName: \"kubernetes.io/projected/848a94d0-7273-4bd8-a9a3-37a0c83d021d-kube-api-access-zkpd6\") pod \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " Mar 13 21:17:18 crc kubenswrapper[4790]: I0313 21:17:18.700973 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-utilities" (OuterVolumeSpecName: "utilities") pod "848a94d0-7273-4bd8-a9a3-37a0c83d021d" (UID: "848a94d0-7273-4bd8-a9a3-37a0c83d021d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:17:18 crc kubenswrapper[4790]: I0313 21:17:18.710902 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/848a94d0-7273-4bd8-a9a3-37a0c83d021d-kube-api-access-zkpd6" (OuterVolumeSpecName: "kube-api-access-zkpd6") pod "848a94d0-7273-4bd8-a9a3-37a0c83d021d" (UID: "848a94d0-7273-4bd8-a9a3-37a0c83d021d"). InnerVolumeSpecName "kube-api-access-zkpd6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:17:18 crc kubenswrapper[4790]: I0313 21:17:18.749540 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "848a94d0-7273-4bd8-a9a3-37a0c83d021d" (UID: "848a94d0-7273-4bd8-a9a3-37a0c83d021d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:17:18 crc kubenswrapper[4790]: I0313 21:17:18.802548 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:17:18 crc kubenswrapper[4790]: I0313 21:17:18.804283 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:17:18 crc kubenswrapper[4790]: I0313 21:17:18.804458 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkpd6\" (UniqueName: \"kubernetes.io/projected/848a94d0-7273-4bd8-a9a3-37a0c83d021d-kube-api-access-zkpd6\") on node \"crc\" DevicePath \"\"" Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.102775 4790 generic.go:334] "Generic (PLEG): container finished" podID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" containerID="11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42" exitCode=0 Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.102856 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.102868 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvlm5" event={"ID":"848a94d0-7273-4bd8-a9a3-37a0c83d021d","Type":"ContainerDied","Data":"11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42"} Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.104025 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvlm5" event={"ID":"848a94d0-7273-4bd8-a9a3-37a0c83d021d","Type":"ContainerDied","Data":"6b0fb3482331fe9ebb6936e0aa0a4fc44603cc49316d776e27cc517752a36fee"} Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.104044 4790 scope.go:117] "RemoveContainer" containerID="11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42" Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.123069 4790 scope.go:117] "RemoveContainer" containerID="9fd47b72534179044f5f7a1a338a235cdc625a54ec35157007a57d9ea25ee65e" Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.145245 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rvlm5"] Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.155682 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rvlm5"] Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.169370 4790 scope.go:117] "RemoveContainer" containerID="6940446cbfb31e7ecbec55ccf29dd9b2ccbde4bc5cd9721825330a7879a2859b" Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.196258 4790 scope.go:117] "RemoveContainer" containerID="11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42" Mar 13 21:17:19 crc kubenswrapper[4790]: E0313 21:17:19.196720 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42\": container with ID starting with 11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42 not found: ID does not exist" containerID="11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42" Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.196832 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42"} err="failed to get container status \"11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42\": rpc error: code = NotFound desc = could not find container \"11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42\": container with ID starting with 11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42 not found: ID does not exist" Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.196913 4790 scope.go:117] "RemoveContainer" containerID="9fd47b72534179044f5f7a1a338a235cdc625a54ec35157007a57d9ea25ee65e" Mar 13 21:17:19 crc kubenswrapper[4790]: E0313 21:17:19.197497 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fd47b72534179044f5f7a1a338a235cdc625a54ec35157007a57d9ea25ee65e\": container with ID starting with 9fd47b72534179044f5f7a1a338a235cdc625a54ec35157007a57d9ea25ee65e not found: ID does not exist" containerID="9fd47b72534179044f5f7a1a338a235cdc625a54ec35157007a57d9ea25ee65e" Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.197608 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fd47b72534179044f5f7a1a338a235cdc625a54ec35157007a57d9ea25ee65e"} err="failed to get container status \"9fd47b72534179044f5f7a1a338a235cdc625a54ec35157007a57d9ea25ee65e\": rpc error: code = NotFound desc = could not find container \"9fd47b72534179044f5f7a1a338a235cdc625a54ec35157007a57d9ea25ee65e\": container with ID 
starting with 9fd47b72534179044f5f7a1a338a235cdc625a54ec35157007a57d9ea25ee65e not found: ID does not exist" Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.197671 4790 scope.go:117] "RemoveContainer" containerID="6940446cbfb31e7ecbec55ccf29dd9b2ccbde4bc5cd9721825330a7879a2859b" Mar 13 21:17:19 crc kubenswrapper[4790]: E0313 21:17:19.197971 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6940446cbfb31e7ecbec55ccf29dd9b2ccbde4bc5cd9721825330a7879a2859b\": container with ID starting with 6940446cbfb31e7ecbec55ccf29dd9b2ccbde4bc5cd9721825330a7879a2859b not found: ID does not exist" containerID="6940446cbfb31e7ecbec55ccf29dd9b2ccbde4bc5cd9721825330a7879a2859b" Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.198005 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6940446cbfb31e7ecbec55ccf29dd9b2ccbde4bc5cd9721825330a7879a2859b"} err="failed to get container status \"6940446cbfb31e7ecbec55ccf29dd9b2ccbde4bc5cd9721825330a7879a2859b\": rpc error: code = NotFound desc = could not find container \"6940446cbfb31e7ecbec55ccf29dd9b2ccbde4bc5cd9721825330a7879a2859b\": container with ID starting with 6940446cbfb31e7ecbec55ccf29dd9b2ccbde4bc5cd9721825330a7879a2859b not found: ID does not exist" Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.682322 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" path="/var/lib/kubelet/pods/848a94d0-7273-4bd8-a9a3-37a0c83d021d/volumes" Mar 13 21:17:44 crc kubenswrapper[4790]: I0313 21:17:44.015484 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:17:44 crc kubenswrapper[4790]: I0313 
21:17:44.016070 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:17:44 crc kubenswrapper[4790]: I0313 21:17:44.016121 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 21:17:44 crc kubenswrapper[4790]: I0313 21:17:44.016943 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 21:17:44 crc kubenswrapper[4790]: I0313 21:17:44.017001 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" gracePeriod=600 Mar 13 21:17:44 crc kubenswrapper[4790]: E0313 21:17:44.139601 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:17:44 crc kubenswrapper[4790]: I0313 21:17:44.317278 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" exitCode=0 Mar 13 21:17:44 crc kubenswrapper[4790]: I0313 21:17:44.317321 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36"} Mar 13 21:17:44 crc kubenswrapper[4790]: I0313 21:17:44.317358 4790 scope.go:117] "RemoveContainer" containerID="75f331721e6162201038d479ba2bbbbd3f6476b2bf5be1d38a4c2de09e217795" Mar 13 21:17:44 crc kubenswrapper[4790]: I0313 21:17:44.317975 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:17:44 crc kubenswrapper[4790]: E0313 21:17:44.318223 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:17:58 crc kubenswrapper[4790]: I0313 21:17:58.659697 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:17:58 crc kubenswrapper[4790]: E0313 21:17:58.660540 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 
21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.143219 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557278-6n5t9"] Mar 13 21:18:00 crc kubenswrapper[4790]: E0313 21:18:00.144005 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" containerName="extract-utilities" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.144028 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" containerName="extract-utilities" Mar 13 21:18:00 crc kubenswrapper[4790]: E0313 21:18:00.144058 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" containerName="registry-server" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.144070 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" containerName="registry-server" Mar 13 21:18:00 crc kubenswrapper[4790]: E0313 21:18:00.144093 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" containerName="extract-content" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.144106 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" containerName="extract-content" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.144432 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" containerName="registry-server" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.145236 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557278-6n5t9" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.147553 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.147632 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.147693 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.160356 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557278-6n5t9"] Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.299165 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l2c9\" (UniqueName: \"kubernetes.io/projected/4d317138-8c0e-4824-a6b0-c25bb9b79631-kube-api-access-7l2c9\") pod \"auto-csr-approver-29557278-6n5t9\" (UID: \"4d317138-8c0e-4824-a6b0-c25bb9b79631\") " pod="openshift-infra/auto-csr-approver-29557278-6n5t9" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.401744 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l2c9\" (UniqueName: \"kubernetes.io/projected/4d317138-8c0e-4824-a6b0-c25bb9b79631-kube-api-access-7l2c9\") pod \"auto-csr-approver-29557278-6n5t9\" (UID: \"4d317138-8c0e-4824-a6b0-c25bb9b79631\") " pod="openshift-infra/auto-csr-approver-29557278-6n5t9" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.426256 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l2c9\" (UniqueName: \"kubernetes.io/projected/4d317138-8c0e-4824-a6b0-c25bb9b79631-kube-api-access-7l2c9\") pod \"auto-csr-approver-29557278-6n5t9\" (UID: \"4d317138-8c0e-4824-a6b0-c25bb9b79631\") " 
pod="openshift-infra/auto-csr-approver-29557278-6n5t9" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.465396 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557278-6n5t9" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.895732 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557278-6n5t9"] Mar 13 21:18:01 crc kubenswrapper[4790]: I0313 21:18:01.470017 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557278-6n5t9" event={"ID":"4d317138-8c0e-4824-a6b0-c25bb9b79631","Type":"ContainerStarted","Data":"adc434aa7408eebe3d38ea736ed275008af1dcbea2dae8dadcd58a62aa08bd3c"} Mar 13 21:18:02 crc kubenswrapper[4790]: I0313 21:18:02.478624 4790 generic.go:334] "Generic (PLEG): container finished" podID="4d317138-8c0e-4824-a6b0-c25bb9b79631" containerID="9bdff7a81ff2a9a8995b79476629c9294b76419c09baf5ddb2aac9365620522e" exitCode=0 Mar 13 21:18:02 crc kubenswrapper[4790]: I0313 21:18:02.478687 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557278-6n5t9" event={"ID":"4d317138-8c0e-4824-a6b0-c25bb9b79631","Type":"ContainerDied","Data":"9bdff7a81ff2a9a8995b79476629c9294b76419c09baf5ddb2aac9365620522e"} Mar 13 21:18:03 crc kubenswrapper[4790]: I0313 21:18:03.843228 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557278-6n5t9" Mar 13 21:18:03 crc kubenswrapper[4790]: I0313 21:18:03.975649 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7l2c9\" (UniqueName: \"kubernetes.io/projected/4d317138-8c0e-4824-a6b0-c25bb9b79631-kube-api-access-7l2c9\") pod \"4d317138-8c0e-4824-a6b0-c25bb9b79631\" (UID: \"4d317138-8c0e-4824-a6b0-c25bb9b79631\") " Mar 13 21:18:03 crc kubenswrapper[4790]: I0313 21:18:03.983945 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d317138-8c0e-4824-a6b0-c25bb9b79631-kube-api-access-7l2c9" (OuterVolumeSpecName: "kube-api-access-7l2c9") pod "4d317138-8c0e-4824-a6b0-c25bb9b79631" (UID: "4d317138-8c0e-4824-a6b0-c25bb9b79631"). InnerVolumeSpecName "kube-api-access-7l2c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:18:04 crc kubenswrapper[4790]: I0313 21:18:04.078613 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7l2c9\" (UniqueName: \"kubernetes.io/projected/4d317138-8c0e-4824-a6b0-c25bb9b79631-kube-api-access-7l2c9\") on node \"crc\" DevicePath \"\"" Mar 13 21:18:04 crc kubenswrapper[4790]: I0313 21:18:04.499957 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557278-6n5t9" event={"ID":"4d317138-8c0e-4824-a6b0-c25bb9b79631","Type":"ContainerDied","Data":"adc434aa7408eebe3d38ea736ed275008af1dcbea2dae8dadcd58a62aa08bd3c"} Mar 13 21:18:04 crc kubenswrapper[4790]: I0313 21:18:04.500011 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adc434aa7408eebe3d38ea736ed275008af1dcbea2dae8dadcd58a62aa08bd3c" Mar 13 21:18:04 crc kubenswrapper[4790]: I0313 21:18:04.500010 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557278-6n5t9" Mar 13 21:18:04 crc kubenswrapper[4790]: I0313 21:18:04.907601 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557272-cdqgf"] Mar 13 21:18:04 crc kubenswrapper[4790]: I0313 21:18:04.916955 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557272-cdqgf"] Mar 13 21:18:05 crc kubenswrapper[4790]: I0313 21:18:05.669124 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b51130e-f39a-4a4c-b41e-c865e51004dd" path="/var/lib/kubelet/pods/7b51130e-f39a-4a4c-b41e-c865e51004dd/volumes" Mar 13 21:18:13 crc kubenswrapper[4790]: I0313 21:18:13.660537 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:18:13 crc kubenswrapper[4790]: E0313 21:18:13.661424 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:18:19 crc kubenswrapper[4790]: I0313 21:18:19.101243 4790 scope.go:117] "RemoveContainer" containerID="4ea25a336829635b84ca0d8e478c73129cc595166d50214c193658a79404456f" Mar 13 21:18:24 crc kubenswrapper[4790]: I0313 21:18:24.659931 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:18:24 crc kubenswrapper[4790]: E0313 21:18:24.660731 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:18:39 crc kubenswrapper[4790]: I0313 21:18:39.666207 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:18:39 crc kubenswrapper[4790]: E0313 21:18:39.666770 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:18:54 crc kubenswrapper[4790]: I0313 21:18:54.659805 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:18:54 crc kubenswrapper[4790]: E0313 21:18:54.660710 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:19:05 crc kubenswrapper[4790]: I0313 21:19:05.660645 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:19:05 crc kubenswrapper[4790]: E0313 21:19:05.661432 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:19:16 crc kubenswrapper[4790]: I0313 21:19:16.659895 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:19:16 crc kubenswrapper[4790]: E0313 21:19:16.660752 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:19:31 crc kubenswrapper[4790]: I0313 21:19:31.659568 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:19:31 crc kubenswrapper[4790]: E0313 21:19:31.660503 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:19:43 crc kubenswrapper[4790]: I0313 21:19:43.660413 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:19:43 crc kubenswrapper[4790]: E0313 21:19:43.661336 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:19:58 crc kubenswrapper[4790]: I0313 21:19:58.660241 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:19:58 crc kubenswrapper[4790]: E0313 21:19:58.660974 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.140494 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557280-f6wtq"] Mar 13 21:20:00 crc kubenswrapper[4790]: E0313 21:20:00.141221 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d317138-8c0e-4824-a6b0-c25bb9b79631" containerName="oc" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.141239 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d317138-8c0e-4824-a6b0-c25bb9b79631" containerName="oc" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.141473 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d317138-8c0e-4824-a6b0-c25bb9b79631" containerName="oc" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.142128 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557280-f6wtq" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.144196 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.144598 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.144828 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.152748 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557280-f6wtq"] Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.225363 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjnb5\" (UniqueName: \"kubernetes.io/projected/c215866d-1e07-4033-8e8a-d7826692bc76-kube-api-access-zjnb5\") pod \"auto-csr-approver-29557280-f6wtq\" (UID: \"c215866d-1e07-4033-8e8a-d7826692bc76\") " pod="openshift-infra/auto-csr-approver-29557280-f6wtq" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.327069 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjnb5\" (UniqueName: \"kubernetes.io/projected/c215866d-1e07-4033-8e8a-d7826692bc76-kube-api-access-zjnb5\") pod \"auto-csr-approver-29557280-f6wtq\" (UID: \"c215866d-1e07-4033-8e8a-d7826692bc76\") " pod="openshift-infra/auto-csr-approver-29557280-f6wtq" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.349587 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjnb5\" (UniqueName: \"kubernetes.io/projected/c215866d-1e07-4033-8e8a-d7826692bc76-kube-api-access-zjnb5\") pod \"auto-csr-approver-29557280-f6wtq\" (UID: \"c215866d-1e07-4033-8e8a-d7826692bc76\") " 
pod="openshift-infra/auto-csr-approver-29557280-f6wtq" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.463245 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557280-f6wtq" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.917869 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557280-f6wtq"] Mar 13 21:20:01 crc kubenswrapper[4790]: I0313 21:20:01.454989 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557280-f6wtq" event={"ID":"c215866d-1e07-4033-8e8a-d7826692bc76","Type":"ContainerStarted","Data":"2c754481c726d5c0d95efcd226007d3a00ebe6f209b3c71e8b4439a056f23e8c"} Mar 13 21:20:03 crc kubenswrapper[4790]: I0313 21:20:03.478825 4790 generic.go:334] "Generic (PLEG): container finished" podID="c215866d-1e07-4033-8e8a-d7826692bc76" containerID="1dfb1a39dcbf9770c39e6abee624c19e7caa14a0b69762f480ec12e76586b37f" exitCode=0 Mar 13 21:20:03 crc kubenswrapper[4790]: I0313 21:20:03.478932 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557280-f6wtq" event={"ID":"c215866d-1e07-4033-8e8a-d7826692bc76","Type":"ContainerDied","Data":"1dfb1a39dcbf9770c39e6abee624c19e7caa14a0b69762f480ec12e76586b37f"} Mar 13 21:20:04 crc kubenswrapper[4790]: I0313 21:20:04.858157 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557280-f6wtq" Mar 13 21:20:05 crc kubenswrapper[4790]: I0313 21:20:05.015334 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjnb5\" (UniqueName: \"kubernetes.io/projected/c215866d-1e07-4033-8e8a-d7826692bc76-kube-api-access-zjnb5\") pod \"c215866d-1e07-4033-8e8a-d7826692bc76\" (UID: \"c215866d-1e07-4033-8e8a-d7826692bc76\") " Mar 13 21:20:05 crc kubenswrapper[4790]: I0313 21:20:05.021445 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c215866d-1e07-4033-8e8a-d7826692bc76-kube-api-access-zjnb5" (OuterVolumeSpecName: "kube-api-access-zjnb5") pod "c215866d-1e07-4033-8e8a-d7826692bc76" (UID: "c215866d-1e07-4033-8e8a-d7826692bc76"). InnerVolumeSpecName "kube-api-access-zjnb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:20:05 crc kubenswrapper[4790]: I0313 21:20:05.117355 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjnb5\" (UniqueName: \"kubernetes.io/projected/c215866d-1e07-4033-8e8a-d7826692bc76-kube-api-access-zjnb5\") on node \"crc\" DevicePath \"\"" Mar 13 21:20:05 crc kubenswrapper[4790]: I0313 21:20:05.503033 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557280-f6wtq" event={"ID":"c215866d-1e07-4033-8e8a-d7826692bc76","Type":"ContainerDied","Data":"2c754481c726d5c0d95efcd226007d3a00ebe6f209b3c71e8b4439a056f23e8c"} Mar 13 21:20:05 crc kubenswrapper[4790]: I0313 21:20:05.503310 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c754481c726d5c0d95efcd226007d3a00ebe6f209b3c71e8b4439a056f23e8c" Mar 13 21:20:05 crc kubenswrapper[4790]: I0313 21:20:05.503104 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557280-f6wtq" Mar 13 21:20:05 crc kubenswrapper[4790]: I0313 21:20:05.939852 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557274-6gbsg"] Mar 13 21:20:05 crc kubenswrapper[4790]: I0313 21:20:05.953751 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557274-6gbsg"] Mar 13 21:20:07 crc kubenswrapper[4790]: I0313 21:20:07.672002 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="657afd7a-d901-4df2-96d4-239bf59388bd" path="/var/lib/kubelet/pods/657afd7a-d901-4df2-96d4-239bf59388bd/volumes" Mar 13 21:20:13 crc kubenswrapper[4790]: I0313 21:20:13.660477 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:20:13 crc kubenswrapper[4790]: E0313 21:20:13.661320 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:20:19 crc kubenswrapper[4790]: I0313 21:20:19.212513 4790 scope.go:117] "RemoveContainer" containerID="d277a0373c5a7461ab377865cd1179cec1bb76b46da5d05b6de42a92acf13b80" Mar 13 21:20:25 crc kubenswrapper[4790]: I0313 21:20:25.659824 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:20:25 crc kubenswrapper[4790]: E0313 21:20:25.660663 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:20:37 crc kubenswrapper[4790]: I0313 21:20:37.660401 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:20:37 crc kubenswrapper[4790]: E0313 21:20:37.661211 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:20:52 crc kubenswrapper[4790]: I0313 21:20:52.661438 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:20:52 crc kubenswrapper[4790]: E0313 21:20:52.662246 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:21:06 crc kubenswrapper[4790]: I0313 21:21:06.660705 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:21:06 crc kubenswrapper[4790]: E0313 21:21:06.661639 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:21:17 crc kubenswrapper[4790]: I0313 21:21:17.660113 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:21:17 crc kubenswrapper[4790]: E0313 21:21:17.660859 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:21:28 crc kubenswrapper[4790]: I0313 21:21:28.661259 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:21:28 crc kubenswrapper[4790]: E0313 21:21:28.662066 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:21:43 crc kubenswrapper[4790]: I0313 21:21:43.660703 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:21:43 crc kubenswrapper[4790]: E0313 21:21:43.661659 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:21:57 crc kubenswrapper[4790]: I0313 21:21:57.660124 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:21:57 crc kubenswrapper[4790]: E0313 21:21:57.661737 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.140627 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557282-h6blc"] Mar 13 21:22:00 crc kubenswrapper[4790]: E0313 21:22:00.141879 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c215866d-1e07-4033-8e8a-d7826692bc76" containerName="oc" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.141899 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c215866d-1e07-4033-8e8a-d7826692bc76" containerName="oc" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.142126 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c215866d-1e07-4033-8e8a-d7826692bc76" containerName="oc" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.142875 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557282-h6blc" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.145057 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.145334 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.145579 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.151918 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557282-h6blc"] Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.190729 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2gmc\" (UniqueName: \"kubernetes.io/projected/e4a5c228-86a3-4945-8d95-44db739406d7-kube-api-access-h2gmc\") pod \"auto-csr-approver-29557282-h6blc\" (UID: \"e4a5c228-86a3-4945-8d95-44db739406d7\") " pod="openshift-infra/auto-csr-approver-29557282-h6blc" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.292745 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2gmc\" (UniqueName: \"kubernetes.io/projected/e4a5c228-86a3-4945-8d95-44db739406d7-kube-api-access-h2gmc\") pod \"auto-csr-approver-29557282-h6blc\" (UID: \"e4a5c228-86a3-4945-8d95-44db739406d7\") " pod="openshift-infra/auto-csr-approver-29557282-h6blc" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.311967 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2gmc\" (UniqueName: \"kubernetes.io/projected/e4a5c228-86a3-4945-8d95-44db739406d7-kube-api-access-h2gmc\") pod \"auto-csr-approver-29557282-h6blc\" (UID: \"e4a5c228-86a3-4945-8d95-44db739406d7\") " 
pod="openshift-infra/auto-csr-approver-29557282-h6blc" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.511156 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557282-h6blc" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.990323 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557282-h6blc"] Mar 13 21:22:01 crc kubenswrapper[4790]: I0313 21:22:01.379757 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557282-h6blc" event={"ID":"e4a5c228-86a3-4945-8d95-44db739406d7","Type":"ContainerStarted","Data":"8a86fcaafe9bdb23070e68c0a31f33e4af357b0443043182911a908590e57eb0"} Mar 13 21:22:03 crc kubenswrapper[4790]: I0313 21:22:03.397652 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557282-h6blc" event={"ID":"e4a5c228-86a3-4945-8d95-44db739406d7","Type":"ContainerStarted","Data":"5019beb318c0070d1f51637c47bb15945a64aa1c344d598234b2e66e74401ef0"} Mar 13 21:22:03 crc kubenswrapper[4790]: I0313 21:22:03.409441 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557282-h6blc" podStartSLOduration=1.422196846 podStartE2EDuration="3.409423333s" podCreationTimestamp="2026-03-13 21:22:00 +0000 UTC" firstStartedPulling="2026-03-13 21:22:01.007885201 +0000 UTC m=+3252.029001092" lastFinishedPulling="2026-03-13 21:22:02.995111688 +0000 UTC m=+3254.016227579" observedRunningTime="2026-03-13 21:22:03.407738966 +0000 UTC m=+3254.428854847" watchObservedRunningTime="2026-03-13 21:22:03.409423333 +0000 UTC m=+3254.430539224" Mar 13 21:22:04 crc kubenswrapper[4790]: I0313 21:22:04.408968 4790 generic.go:334] "Generic (PLEG): container finished" podID="e4a5c228-86a3-4945-8d95-44db739406d7" containerID="5019beb318c0070d1f51637c47bb15945a64aa1c344d598234b2e66e74401ef0" exitCode=0 Mar 13 21:22:04 crc 
kubenswrapper[4790]: I0313 21:22:04.409049 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557282-h6blc" event={"ID":"e4a5c228-86a3-4945-8d95-44db739406d7","Type":"ContainerDied","Data":"5019beb318c0070d1f51637c47bb15945a64aa1c344d598234b2e66e74401ef0"} Mar 13 21:22:05 crc kubenswrapper[4790]: I0313 21:22:05.826635 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557282-h6blc" Mar 13 21:22:05 crc kubenswrapper[4790]: I0313 21:22:05.901863 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2gmc\" (UniqueName: \"kubernetes.io/projected/e4a5c228-86a3-4945-8d95-44db739406d7-kube-api-access-h2gmc\") pod \"e4a5c228-86a3-4945-8d95-44db739406d7\" (UID: \"e4a5c228-86a3-4945-8d95-44db739406d7\") " Mar 13 21:22:05 crc kubenswrapper[4790]: I0313 21:22:05.907797 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4a5c228-86a3-4945-8d95-44db739406d7-kube-api-access-h2gmc" (OuterVolumeSpecName: "kube-api-access-h2gmc") pod "e4a5c228-86a3-4945-8d95-44db739406d7" (UID: "e4a5c228-86a3-4945-8d95-44db739406d7"). InnerVolumeSpecName "kube-api-access-h2gmc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:22:06 crc kubenswrapper[4790]: I0313 21:22:06.004843 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2gmc\" (UniqueName: \"kubernetes.io/projected/e4a5c228-86a3-4945-8d95-44db739406d7-kube-api-access-h2gmc\") on node \"crc\" DevicePath \"\"" Mar 13 21:22:06 crc kubenswrapper[4790]: I0313 21:22:06.433017 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557282-h6blc" event={"ID":"e4a5c228-86a3-4945-8d95-44db739406d7","Type":"ContainerDied","Data":"8a86fcaafe9bdb23070e68c0a31f33e4af357b0443043182911a908590e57eb0"} Mar 13 21:22:06 crc kubenswrapper[4790]: I0313 21:22:06.433394 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a86fcaafe9bdb23070e68c0a31f33e4af357b0443043182911a908590e57eb0" Mar 13 21:22:06 crc kubenswrapper[4790]: I0313 21:22:06.433100 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557282-h6blc" Mar 13 21:22:06 crc kubenswrapper[4790]: I0313 21:22:06.504319 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557276-9dtrx"] Mar 13 21:22:06 crc kubenswrapper[4790]: I0313 21:22:06.516049 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557276-9dtrx"] Mar 13 21:22:07 crc kubenswrapper[4790]: I0313 21:22:07.670794 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f2961af-f195-403c-bfa2-fd01638789d4" path="/var/lib/kubelet/pods/1f2961af-f195-403c-bfa2-fd01638789d4/volumes" Mar 13 21:22:10 crc kubenswrapper[4790]: I0313 21:22:10.660726 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:22:10 crc kubenswrapper[4790]: E0313 21:22:10.661556 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:22:19 crc kubenswrapper[4790]: I0313 21:22:19.321275 4790 scope.go:117] "RemoveContainer" containerID="1ca2f5093d0685422bd455148422a18e13291d0f890d95cbb35dbff344da7e0a" Mar 13 21:22:22 crc kubenswrapper[4790]: I0313 21:22:22.660128 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:22:22 crc kubenswrapper[4790]: E0313 21:22:22.660770 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:22:34 crc kubenswrapper[4790]: I0313 21:22:34.660308 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:22:34 crc kubenswrapper[4790]: E0313 21:22:34.661071 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:22:49 crc kubenswrapper[4790]: I0313 21:22:49.668501 4790 scope.go:117] "RemoveContainer" 
containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:22:50 crc kubenswrapper[4790]: I0313 21:22:50.812114 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"5e764877937c3d83a4b1853363d471bb75b0ef968565309da1f28c291b8d45e7"} Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.488344 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qcdqx"] Mar 13 21:23:31 crc kubenswrapper[4790]: E0313 21:23:31.489452 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a5c228-86a3-4945-8d95-44db739406d7" containerName="oc" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.489470 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a5c228-86a3-4945-8d95-44db739406d7" containerName="oc" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.489740 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a5c228-86a3-4945-8d95-44db739406d7" containerName="oc" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.494685 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.529918 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qcdqx"] Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.653130 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ab722e2-16ac-40ba-9c44-903bf6bb8db8-utilities\") pod \"community-operators-qcdqx\" (UID: \"2ab722e2-16ac-40ba-9c44-903bf6bb8db8\") " pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.653205 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ab722e2-16ac-40ba-9c44-903bf6bb8db8-catalog-content\") pod \"community-operators-qcdqx\" (UID: \"2ab722e2-16ac-40ba-9c44-903bf6bb8db8\") " pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.653316 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtd2p\" (UniqueName: \"kubernetes.io/projected/2ab722e2-16ac-40ba-9c44-903bf6bb8db8-kube-api-access-gtd2p\") pod \"community-operators-qcdqx\" (UID: \"2ab722e2-16ac-40ba-9c44-903bf6bb8db8\") " pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.754844 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ab722e2-16ac-40ba-9c44-903bf6bb8db8-catalog-content\") pod \"community-operators-qcdqx\" (UID: \"2ab722e2-16ac-40ba-9c44-903bf6bb8db8\") " pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.754974 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gtd2p\" (UniqueName: \"kubernetes.io/projected/2ab722e2-16ac-40ba-9c44-903bf6bb8db8-kube-api-access-gtd2p\") pod \"community-operators-qcdqx\" (UID: \"2ab722e2-16ac-40ba-9c44-903bf6bb8db8\") " pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.755134 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ab722e2-16ac-40ba-9c44-903bf6bb8db8-utilities\") pod \"community-operators-qcdqx\" (UID: \"2ab722e2-16ac-40ba-9c44-903bf6bb8db8\") " pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.755982 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ab722e2-16ac-40ba-9c44-903bf6bb8db8-catalog-content\") pod \"community-operators-qcdqx\" (UID: \"2ab722e2-16ac-40ba-9c44-903bf6bb8db8\") " pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.756693 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ab722e2-16ac-40ba-9c44-903bf6bb8db8-utilities\") pod \"community-operators-qcdqx\" (UID: \"2ab722e2-16ac-40ba-9c44-903bf6bb8db8\") " pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.779434 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtd2p\" (UniqueName: \"kubernetes.io/projected/2ab722e2-16ac-40ba-9c44-903bf6bb8db8-kube-api-access-gtd2p\") pod \"community-operators-qcdqx\" (UID: \"2ab722e2-16ac-40ba-9c44-903bf6bb8db8\") " pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.853443 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:32 crc kubenswrapper[4790]: I0313 21:23:32.156049 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qcdqx"] Mar 13 21:23:33 crc kubenswrapper[4790]: I0313 21:23:33.155404 4790 generic.go:334] "Generic (PLEG): container finished" podID="2ab722e2-16ac-40ba-9c44-903bf6bb8db8" containerID="c9e3387e3b57059a7c47d9d4c2339fa61974767686b0ed4a8f9abdc3174ca87d" exitCode=0 Mar 13 21:23:33 crc kubenswrapper[4790]: I0313 21:23:33.155455 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcdqx" event={"ID":"2ab722e2-16ac-40ba-9c44-903bf6bb8db8","Type":"ContainerDied","Data":"c9e3387e3b57059a7c47d9d4c2339fa61974767686b0ed4a8f9abdc3174ca87d"} Mar 13 21:23:33 crc kubenswrapper[4790]: I0313 21:23:33.156418 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcdqx" event={"ID":"2ab722e2-16ac-40ba-9c44-903bf6bb8db8","Type":"ContainerStarted","Data":"45d8ec9a6d95741fa5bd7264d04cc763f01fa0de802d313de423dceeb441c88d"} Mar 13 21:23:33 crc kubenswrapper[4790]: I0313 21:23:33.157778 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 21:23:37 crc kubenswrapper[4790]: I0313 21:23:37.190610 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcdqx" event={"ID":"2ab722e2-16ac-40ba-9c44-903bf6bb8db8","Type":"ContainerStarted","Data":"0a2b4475cb0e313c5d697bae625925d992fb7842bebfe72a72b3212d4e51639e"} Mar 13 21:23:38 crc kubenswrapper[4790]: I0313 21:23:38.204077 4790 generic.go:334] "Generic (PLEG): container finished" podID="2ab722e2-16ac-40ba-9c44-903bf6bb8db8" containerID="0a2b4475cb0e313c5d697bae625925d992fb7842bebfe72a72b3212d4e51639e" exitCode=0 Mar 13 21:23:38 crc kubenswrapper[4790]: I0313 21:23:38.204115 4790 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-qcdqx" event={"ID":"2ab722e2-16ac-40ba-9c44-903bf6bb8db8","Type":"ContainerDied","Data":"0a2b4475cb0e313c5d697bae625925d992fb7842bebfe72a72b3212d4e51639e"} Mar 13 21:23:39 crc kubenswrapper[4790]: I0313 21:23:39.215079 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcdqx" event={"ID":"2ab722e2-16ac-40ba-9c44-903bf6bb8db8","Type":"ContainerStarted","Data":"4ff8fa23aca6fe1e534c846faa65ea12116ad45ec2e0a9c13d4d0d98ed73111a"} Mar 13 21:23:39 crc kubenswrapper[4790]: I0313 21:23:39.236541 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qcdqx" podStartSLOduration=2.701455642 podStartE2EDuration="8.236518439s" podCreationTimestamp="2026-03-13 21:23:31 +0000 UTC" firstStartedPulling="2026-03-13 21:23:33.15754536 +0000 UTC m=+3344.178661241" lastFinishedPulling="2026-03-13 21:23:38.692608147 +0000 UTC m=+3349.713724038" observedRunningTime="2026-03-13 21:23:39.233705672 +0000 UTC m=+3350.254821593" watchObservedRunningTime="2026-03-13 21:23:39.236518439 +0000 UTC m=+3350.257634330" Mar 13 21:23:41 crc kubenswrapper[4790]: I0313 21:23:41.854151 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:41 crc kubenswrapper[4790]: I0313 21:23:41.854791 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:41 crc kubenswrapper[4790]: I0313 21:23:41.923358 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:43 crc kubenswrapper[4790]: I0313 21:23:43.298351 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:43 crc kubenswrapper[4790]: I0313 21:23:43.388532 
4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qcdqx"] Mar 13 21:23:43 crc kubenswrapper[4790]: I0313 21:23:43.435723 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cpxlj"] Mar 13 21:23:43 crc kubenswrapper[4790]: I0313 21:23:43.435959 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cpxlj" podUID="62b23203-ded5-4b14-8a86-89c3ce3e33df" containerName="registry-server" containerID="cri-o://1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5" gracePeriod=2 Mar 13 21:23:43 crc kubenswrapper[4790]: I0313 21:23:43.920108 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cpxlj" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.005015 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7qhv\" (UniqueName: \"kubernetes.io/projected/62b23203-ded5-4b14-8a86-89c3ce3e33df-kube-api-access-z7qhv\") pod \"62b23203-ded5-4b14-8a86-89c3ce3e33df\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.005094 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-catalog-content\") pod \"62b23203-ded5-4b14-8a86-89c3ce3e33df\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.005190 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-utilities\") pod \"62b23203-ded5-4b14-8a86-89c3ce3e33df\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.007820 4790 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-utilities" (OuterVolumeSpecName: "utilities") pod "62b23203-ded5-4b14-8a86-89c3ce3e33df" (UID: "62b23203-ded5-4b14-8a86-89c3ce3e33df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.015958 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62b23203-ded5-4b14-8a86-89c3ce3e33df-kube-api-access-z7qhv" (OuterVolumeSpecName: "kube-api-access-z7qhv") pod "62b23203-ded5-4b14-8a86-89c3ce3e33df" (UID: "62b23203-ded5-4b14-8a86-89c3ce3e33df"). InnerVolumeSpecName "kube-api-access-z7qhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.076607 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "62b23203-ded5-4b14-8a86-89c3ce3e33df" (UID: "62b23203-ded5-4b14-8a86-89c3ce3e33df"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.107684 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7qhv\" (UniqueName: \"kubernetes.io/projected/62b23203-ded5-4b14-8a86-89c3ce3e33df-kube-api-access-z7qhv\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.107722 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.107734 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.262238 4790 generic.go:334] "Generic (PLEG): container finished" podID="62b23203-ded5-4b14-8a86-89c3ce3e33df" containerID="1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5" exitCode=0 Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.262293 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cpxlj" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.262299 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpxlj" event={"ID":"62b23203-ded5-4b14-8a86-89c3ce3e33df","Type":"ContainerDied","Data":"1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5"} Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.262354 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpxlj" event={"ID":"62b23203-ded5-4b14-8a86-89c3ce3e33df","Type":"ContainerDied","Data":"abfa15f6de4daed047e18e5a602cd0577d104072963eda4b67a1d006df7fb930"} Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.262398 4790 scope.go:117] "RemoveContainer" containerID="1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.301435 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cpxlj"] Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.311038 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cpxlj"] Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.321320 4790 scope.go:117] "RemoveContainer" containerID="abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.369103 4790 scope.go:117] "RemoveContainer" containerID="425aaeba7e7d5553bbf3404989d75b923a49be0649aab9f4f7747d04b2f856fc" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.403090 4790 scope.go:117] "RemoveContainer" containerID="1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5" Mar 13 21:23:44 crc kubenswrapper[4790]: E0313 21:23:44.403568 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5\": container with ID starting with 1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5 not found: ID does not exist" containerID="1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.403617 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5"} err="failed to get container status \"1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5\": rpc error: code = NotFound desc = could not find container \"1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5\": container with ID starting with 1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5 not found: ID does not exist" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.403642 4790 scope.go:117] "RemoveContainer" containerID="abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2" Mar 13 21:23:44 crc kubenswrapper[4790]: E0313 21:23:44.403997 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2\": container with ID starting with abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2 not found: ID does not exist" containerID="abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.404036 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2"} err="failed to get container status \"abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2\": rpc error: code = NotFound desc = could not find container \"abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2\": container with ID 
starting with abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2 not found: ID does not exist" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.404090 4790 scope.go:117] "RemoveContainer" containerID="425aaeba7e7d5553bbf3404989d75b923a49be0649aab9f4f7747d04b2f856fc" Mar 13 21:23:44 crc kubenswrapper[4790]: E0313 21:23:44.404343 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"425aaeba7e7d5553bbf3404989d75b923a49be0649aab9f4f7747d04b2f856fc\": container with ID starting with 425aaeba7e7d5553bbf3404989d75b923a49be0649aab9f4f7747d04b2f856fc not found: ID does not exist" containerID="425aaeba7e7d5553bbf3404989d75b923a49be0649aab9f4f7747d04b2f856fc" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.404368 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"425aaeba7e7d5553bbf3404989d75b923a49be0649aab9f4f7747d04b2f856fc"} err="failed to get container status \"425aaeba7e7d5553bbf3404989d75b923a49be0649aab9f4f7747d04b2f856fc\": rpc error: code = NotFound desc = could not find container \"425aaeba7e7d5553bbf3404989d75b923a49be0649aab9f4f7747d04b2f856fc\": container with ID starting with 425aaeba7e7d5553bbf3404989d75b923a49be0649aab9f4f7747d04b2f856fc not found: ID does not exist" Mar 13 21:23:45 crc kubenswrapper[4790]: I0313 21:23:45.671003 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62b23203-ded5-4b14-8a86-89c3ce3e33df" path="/var/lib/kubelet/pods/62b23203-ded5-4b14-8a86-89c3ce3e33df/volumes" Mar 13 21:23:47 crc kubenswrapper[4790]: I0313 21:23:47.289015 4790 generic.go:334] "Generic (PLEG): container finished" podID="50c1f858-4451-4e6e-9e80-6e37528305a2" containerID="5f3dc8212dd652060ecb9c9d45ce324d2168353ccf633608e4415a58fb8949f8" exitCode=0 Mar 13 21:23:47 crc kubenswrapper[4790]: I0313 21:23:47.289129 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tempest-tests-tempest" event={"ID":"50c1f858-4451-4e6e-9e80-6e37528305a2","Type":"ContainerDied","Data":"5f3dc8212dd652060ecb9c9d45ce324d2168353ccf633608e4415a58fb8949f8"} Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.628959 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.693767 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-temporary\") pod \"50c1f858-4451-4e6e-9e80-6e37528305a2\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.693870 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbzlr\" (UniqueName: \"kubernetes.io/projected/50c1f858-4451-4e6e-9e80-6e37528305a2-kube-api-access-pbzlr\") pod \"50c1f858-4451-4e6e-9e80-6e37528305a2\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.693943 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config-secret\") pod \"50c1f858-4451-4e6e-9e80-6e37528305a2\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.694007 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ca-certs\") pod \"50c1f858-4451-4e6e-9e80-6e37528305a2\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.694038 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-workdir\") pod \"50c1f858-4451-4e6e-9e80-6e37528305a2\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.694103 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config\") pod \"50c1f858-4451-4e6e-9e80-6e37528305a2\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.694196 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ssh-key\") pod \"50c1f858-4451-4e6e-9e80-6e37528305a2\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.694238 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"50c1f858-4451-4e6e-9e80-6e37528305a2\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.694327 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-config-data\") pod \"50c1f858-4451-4e6e-9e80-6e37528305a2\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.702094 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "50c1f858-4451-4e6e-9e80-6e37528305a2" (UID: "50c1f858-4451-4e6e-9e80-6e37528305a2"). 
InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.702488 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50c1f858-4451-4e6e-9e80-6e37528305a2-kube-api-access-pbzlr" (OuterVolumeSpecName: "kube-api-access-pbzlr") pod "50c1f858-4451-4e6e-9e80-6e37528305a2" (UID: "50c1f858-4451-4e6e-9e80-6e37528305a2"). InnerVolumeSpecName "kube-api-access-pbzlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.702722 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "50c1f858-4451-4e6e-9e80-6e37528305a2" (UID: "50c1f858-4451-4e6e-9e80-6e37528305a2"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.703479 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-config-data" (OuterVolumeSpecName: "config-data") pod "50c1f858-4451-4e6e-9e80-6e37528305a2" (UID: "50c1f858-4451-4e6e-9e80-6e37528305a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.708579 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "test-operator-logs") pod "50c1f858-4451-4e6e-9e80-6e37528305a2" (UID: "50c1f858-4451-4e6e-9e80-6e37528305a2"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.726774 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "50c1f858-4451-4e6e-9e80-6e37528305a2" (UID: "50c1f858-4451-4e6e-9e80-6e37528305a2"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.740647 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "50c1f858-4451-4e6e-9e80-6e37528305a2" (UID: "50c1f858-4451-4e6e-9e80-6e37528305a2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.741506 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "50c1f858-4451-4e6e-9e80-6e37528305a2" (UID: "50c1f858-4451-4e6e-9e80-6e37528305a2"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.750245 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "50c1f858-4451-4e6e-9e80-6e37528305a2" (UID: "50c1f858-4451-4e6e-9e80-6e37528305a2"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.800030 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.800074 4790 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.800091 4790 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.800104 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.800117 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.800141 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.800154 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.800166 4790 
reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.800179 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbzlr\" (UniqueName: \"kubernetes.io/projected/50c1f858-4451-4e6e-9e80-6e37528305a2-kube-api-access-pbzlr\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.820371 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.902355 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:49 crc kubenswrapper[4790]: I0313 21:23:49.317762 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"50c1f858-4451-4e6e-9e80-6e37528305a2","Type":"ContainerDied","Data":"9ae048789cc06b95d8d9a690f59586791ebaca094ac82840d0dd227be9680876"} Mar 13 21:23:49 crc kubenswrapper[4790]: I0313 21:23:49.317824 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ae048789cc06b95d8d9a690f59586791ebaca094ac82840d0dd227be9680876" Mar 13 21:23:49 crc kubenswrapper[4790]: I0313 21:23:49.317911 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.170654 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557284-fszb4"] Mar 13 21:24:00 crc kubenswrapper[4790]: E0313 21:24:00.173076 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62b23203-ded5-4b14-8a86-89c3ce3e33df" containerName="extract-content" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.173192 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b23203-ded5-4b14-8a86-89c3ce3e33df" containerName="extract-content" Mar 13 21:24:00 crc kubenswrapper[4790]: E0313 21:24:00.173330 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62b23203-ded5-4b14-8a86-89c3ce3e33df" containerName="registry-server" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.173445 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b23203-ded5-4b14-8a86-89c3ce3e33df" containerName="registry-server" Mar 13 21:24:00 crc kubenswrapper[4790]: E0313 21:24:00.173547 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50c1f858-4451-4e6e-9e80-6e37528305a2" containerName="tempest-tests-tempest-tests-runner" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.173644 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="50c1f858-4451-4e6e-9e80-6e37528305a2" containerName="tempest-tests-tempest-tests-runner" Mar 13 21:24:00 crc kubenswrapper[4790]: E0313 21:24:00.173738 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62b23203-ded5-4b14-8a86-89c3ce3e33df" containerName="extract-utilities" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.173824 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b23203-ded5-4b14-8a86-89c3ce3e33df" containerName="extract-utilities" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.174184 4790 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="62b23203-ded5-4b14-8a86-89c3ce3e33df" containerName="registry-server" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.174300 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="50c1f858-4451-4e6e-9e80-6e37528305a2" containerName="tempest-tests-tempest-tests-runner" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.175351 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557284-fszb4" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.178296 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.178423 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.179002 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.181185 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557284-fszb4"] Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.262555 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snbfp\" (UniqueName: \"kubernetes.io/projected/e71263f0-7309-4046-b71d-2ae38e13d27c-kube-api-access-snbfp\") pod \"auto-csr-approver-29557284-fszb4\" (UID: \"e71263f0-7309-4046-b71d-2ae38e13d27c\") " pod="openshift-infra/auto-csr-approver-29557284-fszb4" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.364782 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snbfp\" (UniqueName: \"kubernetes.io/projected/e71263f0-7309-4046-b71d-2ae38e13d27c-kube-api-access-snbfp\") pod \"auto-csr-approver-29557284-fszb4\" (UID: \"e71263f0-7309-4046-b71d-2ae38e13d27c\") " 
pod="openshift-infra/auto-csr-approver-29557284-fszb4" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.389683 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snbfp\" (UniqueName: \"kubernetes.io/projected/e71263f0-7309-4046-b71d-2ae38e13d27c-kube-api-access-snbfp\") pod \"auto-csr-approver-29557284-fszb4\" (UID: \"e71263f0-7309-4046-b71d-2ae38e13d27c\") " pod="openshift-infra/auto-csr-approver-29557284-fszb4" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.498677 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557284-fszb4" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.944725 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557284-fszb4"] Mar 13 21:24:01 crc kubenswrapper[4790]: I0313 21:24:01.422573 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557284-fszb4" event={"ID":"e71263f0-7309-4046-b71d-2ae38e13d27c","Type":"ContainerStarted","Data":"271547d8b5572b272e3875c751e4a0cfb77044bb630cfbf410e2540184ce24db"} Mar 13 21:24:02 crc kubenswrapper[4790]: I0313 21:24:02.431366 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557284-fszb4" event={"ID":"e71263f0-7309-4046-b71d-2ae38e13d27c","Type":"ContainerStarted","Data":"b6265fc857b5a799a558f01ccfe69d069d440ad15cd4409b5956f9cdc01bead3"} Mar 13 21:24:02 crc kubenswrapper[4790]: I0313 21:24:02.453371 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557284-fszb4" podStartSLOduration=1.396279386 podStartE2EDuration="2.453353946s" podCreationTimestamp="2026-03-13 21:24:00 +0000 UTC" firstStartedPulling="2026-03-13 21:24:00.953002897 +0000 UTC m=+3371.974118788" lastFinishedPulling="2026-03-13 21:24:02.010077457 +0000 UTC m=+3373.031193348" observedRunningTime="2026-03-13 
21:24:02.448108343 +0000 UTC m=+3373.469224234" watchObservedRunningTime="2026-03-13 21:24:02.453353946 +0000 UTC m=+3373.474469837" Mar 13 21:24:03 crc kubenswrapper[4790]: I0313 21:24:03.442647 4790 generic.go:334] "Generic (PLEG): container finished" podID="e71263f0-7309-4046-b71d-2ae38e13d27c" containerID="b6265fc857b5a799a558f01ccfe69d069d440ad15cd4409b5956f9cdc01bead3" exitCode=0 Mar 13 21:24:03 crc kubenswrapper[4790]: I0313 21:24:03.442730 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557284-fszb4" event={"ID":"e71263f0-7309-4046-b71d-2ae38e13d27c","Type":"ContainerDied","Data":"b6265fc857b5a799a558f01ccfe69d069d440ad15cd4409b5956f9cdc01bead3"} Mar 13 21:24:04 crc kubenswrapper[4790]: I0313 21:24:04.797164 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557284-fszb4" Mar 13 21:24:04 crc kubenswrapper[4790]: I0313 21:24:04.879808 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snbfp\" (UniqueName: \"kubernetes.io/projected/e71263f0-7309-4046-b71d-2ae38e13d27c-kube-api-access-snbfp\") pod \"e71263f0-7309-4046-b71d-2ae38e13d27c\" (UID: \"e71263f0-7309-4046-b71d-2ae38e13d27c\") " Mar 13 21:24:04 crc kubenswrapper[4790]: I0313 21:24:04.893192 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e71263f0-7309-4046-b71d-2ae38e13d27c-kube-api-access-snbfp" (OuterVolumeSpecName: "kube-api-access-snbfp") pod "e71263f0-7309-4046-b71d-2ae38e13d27c" (UID: "e71263f0-7309-4046-b71d-2ae38e13d27c"). InnerVolumeSpecName "kube-api-access-snbfp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:24:04 crc kubenswrapper[4790]: I0313 21:24:04.982714 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snbfp\" (UniqueName: \"kubernetes.io/projected/e71263f0-7309-4046-b71d-2ae38e13d27c-kube-api-access-snbfp\") on node \"crc\" DevicePath \"\"" Mar 13 21:24:05 crc kubenswrapper[4790]: I0313 21:24:05.459151 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557284-fszb4" event={"ID":"e71263f0-7309-4046-b71d-2ae38e13d27c","Type":"ContainerDied","Data":"271547d8b5572b272e3875c751e4a0cfb77044bb630cfbf410e2540184ce24db"} Mar 13 21:24:05 crc kubenswrapper[4790]: I0313 21:24:05.459201 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557284-fszb4" Mar 13 21:24:05 crc kubenswrapper[4790]: I0313 21:24:05.459212 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="271547d8b5572b272e3875c751e4a0cfb77044bb630cfbf410e2540184ce24db" Mar 13 21:24:05 crc kubenswrapper[4790]: I0313 21:24:05.528421 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557278-6n5t9"] Mar 13 21:24:05 crc kubenswrapper[4790]: I0313 21:24:05.538117 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557278-6n5t9"] Mar 13 21:24:05 crc kubenswrapper[4790]: I0313 21:24:05.669912 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d317138-8c0e-4824-a6b0-c25bb9b79631" path="/var/lib/kubelet/pods/4d317138-8c0e-4824-a6b0-c25bb9b79631/volumes" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.317898 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-smm9t"] Mar 13 21:24:06 crc kubenswrapper[4790]: E0313 21:24:06.318991 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e71263f0-7309-4046-b71d-2ae38e13d27c" containerName="oc" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.319024 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e71263f0-7309-4046-b71d-2ae38e13d27c" containerName="oc" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.319222 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e71263f0-7309-4046-b71d-2ae38e13d27c" containerName="oc" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.321042 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.328665 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-smm9t"] Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.408598 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-catalog-content\") pod \"redhat-operators-smm9t\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.408769 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vqcz\" (UniqueName: \"kubernetes.io/projected/e47c235d-1228-4ded-9bc0-5dc34e05572f-kube-api-access-4vqcz\") pod \"redhat-operators-smm9t\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.408848 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-utilities\") pod \"redhat-operators-smm9t\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " 
pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.511505 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-utilities\") pod \"redhat-operators-smm9t\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.511635 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-catalog-content\") pod \"redhat-operators-smm9t\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.511701 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vqcz\" (UniqueName: \"kubernetes.io/projected/e47c235d-1228-4ded-9bc0-5dc34e05572f-kube-api-access-4vqcz\") pod \"redhat-operators-smm9t\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.512504 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-utilities\") pod \"redhat-operators-smm9t\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.512715 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-catalog-content\") pod \"redhat-operators-smm9t\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:06 crc 
kubenswrapper[4790]: I0313 21:24:06.532684 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vqcz\" (UniqueName: \"kubernetes.io/projected/e47c235d-1228-4ded-9bc0-5dc34e05572f-kube-api-access-4vqcz\") pod \"redhat-operators-smm9t\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.643309 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:07 crc kubenswrapper[4790]: I0313 21:24:07.107316 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-smm9t"] Mar 13 21:24:07 crc kubenswrapper[4790]: I0313 21:24:07.487800 4790 generic.go:334] "Generic (PLEG): container finished" podID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerID="5ff6aad9a74d3b9806cab406ced1dbfc57c706a914e5ccd3e84b344123c800fd" exitCode=0 Mar 13 21:24:07 crc kubenswrapper[4790]: I0313 21:24:07.488083 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smm9t" event={"ID":"e47c235d-1228-4ded-9bc0-5dc34e05572f","Type":"ContainerDied","Data":"5ff6aad9a74d3b9806cab406ced1dbfc57c706a914e5ccd3e84b344123c800fd"} Mar 13 21:24:07 crc kubenswrapper[4790]: I0313 21:24:07.488112 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smm9t" event={"ID":"e47c235d-1228-4ded-9bc0-5dc34e05572f","Type":"ContainerStarted","Data":"a27118c7da3719bfb8ec9d74d4e9c7353a1e0f88365aff82b5f056ddcc2d1492"} Mar 13 21:24:09 crc kubenswrapper[4790]: I0313 21:24:09.509319 4790 generic.go:334] "Generic (PLEG): container finished" podID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerID="b67f8916aa424359681bcf55b86d8ee6c5e2c8e33dbc7f8cd544914033f54d0f" exitCode=0 Mar 13 21:24:09 crc kubenswrapper[4790]: I0313 21:24:09.509407 4790 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-smm9t" event={"ID":"e47c235d-1228-4ded-9bc0-5dc34e05572f","Type":"ContainerDied","Data":"b67f8916aa424359681bcf55b86d8ee6c5e2c8e33dbc7f8cd544914033f54d0f"} Mar 13 21:24:10 crc kubenswrapper[4790]: I0313 21:24:10.520803 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smm9t" event={"ID":"e47c235d-1228-4ded-9bc0-5dc34e05572f","Type":"ContainerStarted","Data":"c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55"} Mar 13 21:24:10 crc kubenswrapper[4790]: I0313 21:24:10.541953 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-smm9t" podStartSLOduration=1.991387137 podStartE2EDuration="4.541922697s" podCreationTimestamp="2026-03-13 21:24:06 +0000 UTC" firstStartedPulling="2026-03-13 21:24:07.502830943 +0000 UTC m=+3378.523946834" lastFinishedPulling="2026-03-13 21:24:10.053366463 +0000 UTC m=+3381.074482394" observedRunningTime="2026-03-13 21:24:10.537178876 +0000 UTC m=+3381.558294777" watchObservedRunningTime="2026-03-13 21:24:10.541922697 +0000 UTC m=+3381.563038588" Mar 13 21:24:16 crc kubenswrapper[4790]: I0313 21:24:16.643642 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:16 crc kubenswrapper[4790]: I0313 21:24:16.652583 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:17 crc kubenswrapper[4790]: I0313 21:24:17.694046 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-smm9t" podUID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerName="registry-server" probeResult="failure" output=< Mar 13 21:24:17 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Mar 13 21:24:17 crc kubenswrapper[4790]: > Mar 13 21:24:19 crc kubenswrapper[4790]: I0313 
21:24:19.414260 4790 scope.go:117] "RemoveContainer" containerID="9bdff7a81ff2a9a8995b79476629c9294b76419c09baf5ddb2aac9365620522e" Mar 13 21:24:26 crc kubenswrapper[4790]: I0313 21:24:26.691932 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:26 crc kubenswrapper[4790]: I0313 21:24:26.742262 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:26 crc kubenswrapper[4790]: I0313 21:24:26.928150 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-smm9t"] Mar 13 21:24:28 crc kubenswrapper[4790]: I0313 21:24:28.710294 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-smm9t" podUID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerName="registry-server" containerID="cri-o://c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55" gracePeriod=2 Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.133723 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.269096 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-utilities\") pod \"e47c235d-1228-4ded-9bc0-5dc34e05572f\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.269229 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-catalog-content\") pod \"e47c235d-1228-4ded-9bc0-5dc34e05572f\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.269334 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vqcz\" (UniqueName: \"kubernetes.io/projected/e47c235d-1228-4ded-9bc0-5dc34e05572f-kube-api-access-4vqcz\") pod \"e47c235d-1228-4ded-9bc0-5dc34e05572f\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.269998 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-utilities" (OuterVolumeSpecName: "utilities") pod "e47c235d-1228-4ded-9bc0-5dc34e05572f" (UID: "e47c235d-1228-4ded-9bc0-5dc34e05572f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.275123 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e47c235d-1228-4ded-9bc0-5dc34e05572f-kube-api-access-4vqcz" (OuterVolumeSpecName: "kube-api-access-4vqcz") pod "e47c235d-1228-4ded-9bc0-5dc34e05572f" (UID: "e47c235d-1228-4ded-9bc0-5dc34e05572f"). InnerVolumeSpecName "kube-api-access-4vqcz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.372131 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.372621 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vqcz\" (UniqueName: \"kubernetes.io/projected/e47c235d-1228-4ded-9bc0-5dc34e05572f-kube-api-access-4vqcz\") on node \"crc\" DevicePath \"\"" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.394688 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e47c235d-1228-4ded-9bc0-5dc34e05572f" (UID: "e47c235d-1228-4ded-9bc0-5dc34e05572f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.474028 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.722627 4790 generic.go:334] "Generic (PLEG): container finished" podID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerID="c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55" exitCode=0 Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.722677 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smm9t" event={"ID":"e47c235d-1228-4ded-9bc0-5dc34e05572f","Type":"ContainerDied","Data":"c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55"} Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.722718 4790 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-smm9t" event={"ID":"e47c235d-1228-4ded-9bc0-5dc34e05572f","Type":"ContainerDied","Data":"a27118c7da3719bfb8ec9d74d4e9c7353a1e0f88365aff82b5f056ddcc2d1492"} Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.722741 4790 scope.go:117] "RemoveContainer" containerID="c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.722774 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.751764 4790 scope.go:117] "RemoveContainer" containerID="b67f8916aa424359681bcf55b86d8ee6c5e2c8e33dbc7f8cd544914033f54d0f" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.762291 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-smm9t"] Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.772994 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-smm9t"] Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.775298 4790 scope.go:117] "RemoveContainer" containerID="5ff6aad9a74d3b9806cab406ced1dbfc57c706a914e5ccd3e84b344123c800fd" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.826361 4790 scope.go:117] "RemoveContainer" containerID="c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55" Mar 13 21:24:29 crc kubenswrapper[4790]: E0313 21:24:29.826989 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55\": container with ID starting with c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55 not found: ID does not exist" containerID="c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.827032 4790 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55"} err="failed to get container status \"c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55\": rpc error: code = NotFound desc = could not find container \"c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55\": container with ID starting with c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55 not found: ID does not exist" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.827059 4790 scope.go:117] "RemoveContainer" containerID="b67f8916aa424359681bcf55b86d8ee6c5e2c8e33dbc7f8cd544914033f54d0f" Mar 13 21:24:29 crc kubenswrapper[4790]: E0313 21:24:29.827413 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b67f8916aa424359681bcf55b86d8ee6c5e2c8e33dbc7f8cd544914033f54d0f\": container with ID starting with b67f8916aa424359681bcf55b86d8ee6c5e2c8e33dbc7f8cd544914033f54d0f not found: ID does not exist" containerID="b67f8916aa424359681bcf55b86d8ee6c5e2c8e33dbc7f8cd544914033f54d0f" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.827436 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b67f8916aa424359681bcf55b86d8ee6c5e2c8e33dbc7f8cd544914033f54d0f"} err="failed to get container status \"b67f8916aa424359681bcf55b86d8ee6c5e2c8e33dbc7f8cd544914033f54d0f\": rpc error: code = NotFound desc = could not find container \"b67f8916aa424359681bcf55b86d8ee6c5e2c8e33dbc7f8cd544914033f54d0f\": container with ID starting with b67f8916aa424359681bcf55b86d8ee6c5e2c8e33dbc7f8cd544914033f54d0f not found: ID does not exist" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.827448 4790 scope.go:117] "RemoveContainer" containerID="5ff6aad9a74d3b9806cab406ced1dbfc57c706a914e5ccd3e84b344123c800fd" Mar 13 21:24:29 crc kubenswrapper[4790]: E0313 
21:24:29.829561 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ff6aad9a74d3b9806cab406ced1dbfc57c706a914e5ccd3e84b344123c800fd\": container with ID starting with 5ff6aad9a74d3b9806cab406ced1dbfc57c706a914e5ccd3e84b344123c800fd not found: ID does not exist" containerID="5ff6aad9a74d3b9806cab406ced1dbfc57c706a914e5ccd3e84b344123c800fd" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.829631 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff6aad9a74d3b9806cab406ced1dbfc57c706a914e5ccd3e84b344123c800fd"} err="failed to get container status \"5ff6aad9a74d3b9806cab406ced1dbfc57c706a914e5ccd3e84b344123c800fd\": rpc error: code = NotFound desc = could not find container \"5ff6aad9a74d3b9806cab406ced1dbfc57c706a914e5ccd3e84b344123c800fd\": container with ID starting with 5ff6aad9a74d3b9806cab406ced1dbfc57c706a914e5ccd3e84b344123c800fd not found: ID does not exist" Mar 13 21:24:31 crc kubenswrapper[4790]: I0313 21:24:31.672718 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e47c235d-1228-4ded-9bc0-5dc34e05572f" path="/var/lib/kubelet/pods/e47c235d-1228-4ded-9bc0-5dc34e05572f/volumes" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.466289 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d6kfv/must-gather-qf7z2"] Mar 13 21:24:37 crc kubenswrapper[4790]: E0313 21:24:37.467222 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerName="registry-server" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.467237 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerName="registry-server" Mar 13 21:24:37 crc kubenswrapper[4790]: E0313 21:24:37.467271 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e47c235d-1228-4ded-9bc0-5dc34e05572f" 
containerName="extract-utilities" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.467280 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerName="extract-utilities" Mar 13 21:24:37 crc kubenswrapper[4790]: E0313 21:24:37.467297 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerName="extract-content" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.467305 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerName="extract-content" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.467498 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerName="registry-server" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.468524 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6kfv/must-gather-qf7z2" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.470108 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-d6kfv"/"default-dockercfg-j27lm" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.470428 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-d6kfv"/"kube-root-ca.crt" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.470593 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-d6kfv"/"openshift-service-ca.crt" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.478903 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d6kfv/must-gather-qf7z2"] Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.544824 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/09855131-fcae-4c41-83c2-2874fd6e7068-must-gather-output\") pod \"must-gather-qf7z2\" (UID: \"09855131-fcae-4c41-83c2-2874fd6e7068\") " pod="openshift-must-gather-d6kfv/must-gather-qf7z2" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.544911 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnqnf\" (UniqueName: \"kubernetes.io/projected/09855131-fcae-4c41-83c2-2874fd6e7068-kube-api-access-hnqnf\") pod \"must-gather-qf7z2\" (UID: \"09855131-fcae-4c41-83c2-2874fd6e7068\") " pod="openshift-must-gather-d6kfv/must-gather-qf7z2" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.646688 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09855131-fcae-4c41-83c2-2874fd6e7068-must-gather-output\") pod \"must-gather-qf7z2\" (UID: \"09855131-fcae-4c41-83c2-2874fd6e7068\") " pod="openshift-must-gather-d6kfv/must-gather-qf7z2" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.646781 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnqnf\" (UniqueName: \"kubernetes.io/projected/09855131-fcae-4c41-83c2-2874fd6e7068-kube-api-access-hnqnf\") pod \"must-gather-qf7z2\" (UID: \"09855131-fcae-4c41-83c2-2874fd6e7068\") " pod="openshift-must-gather-d6kfv/must-gather-qf7z2" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.647557 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09855131-fcae-4c41-83c2-2874fd6e7068-must-gather-output\") pod \"must-gather-qf7z2\" (UID: \"09855131-fcae-4c41-83c2-2874fd6e7068\") " pod="openshift-must-gather-d6kfv/must-gather-qf7z2" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.675327 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnqnf\" (UniqueName: 
\"kubernetes.io/projected/09855131-fcae-4c41-83c2-2874fd6e7068-kube-api-access-hnqnf\") pod \"must-gather-qf7z2\" (UID: \"09855131-fcae-4c41-83c2-2874fd6e7068\") " pod="openshift-must-gather-d6kfv/must-gather-qf7z2" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.790132 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6kfv/must-gather-qf7z2" Mar 13 21:24:38 crc kubenswrapper[4790]: I0313 21:24:38.246362 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d6kfv/must-gather-qf7z2"] Mar 13 21:24:38 crc kubenswrapper[4790]: I0313 21:24:38.796278 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6kfv/must-gather-qf7z2" event={"ID":"09855131-fcae-4c41-83c2-2874fd6e7068","Type":"ContainerStarted","Data":"bdf221b7afb6aaa5a6f44570f4a6373a7ad59c372a30ae5b7c3598adb658e3ef"} Mar 13 21:24:46 crc kubenswrapper[4790]: I0313 21:24:46.903850 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6kfv/must-gather-qf7z2" event={"ID":"09855131-fcae-4c41-83c2-2874fd6e7068","Type":"ContainerStarted","Data":"f2a116706cb391169c51f4180351f0429f8c305252cf4438d5b41c53f1d8a0cb"} Mar 13 21:24:46 crc kubenswrapper[4790]: I0313 21:24:46.904414 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6kfv/must-gather-qf7z2" event={"ID":"09855131-fcae-4c41-83c2-2874fd6e7068","Type":"ContainerStarted","Data":"3a1d32bb413765095ebca93898109c48096d1087d7c63a4448ef4a85e11734c8"} Mar 13 21:24:46 crc kubenswrapper[4790]: I0313 21:24:46.923519 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d6kfv/must-gather-qf7z2" podStartSLOduration=2.442272951 podStartE2EDuration="9.923504477s" podCreationTimestamp="2026-03-13 21:24:37 +0000 UTC" firstStartedPulling="2026-03-13 21:24:38.255899621 +0000 UTC m=+3409.277015512" lastFinishedPulling="2026-03-13 21:24:45.737131147 +0000 UTC 
m=+3416.758247038" observedRunningTime="2026-03-13 21:24:46.922423867 +0000 UTC m=+3417.943539758" watchObservedRunningTime="2026-03-13 21:24:46.923504477 +0000 UTC m=+3417.944620368" Mar 13 21:24:49 crc kubenswrapper[4790]: I0313 21:24:49.566946 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d6kfv/crc-debug-w42ql"] Mar 13 21:24:49 crc kubenswrapper[4790]: I0313 21:24:49.569006 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-w42ql" Mar 13 21:24:49 crc kubenswrapper[4790]: I0313 21:24:49.699264 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cedbf666-53d4-4452-849e-8714a7e57e3c-host\") pod \"crc-debug-w42ql\" (UID: \"cedbf666-53d4-4452-849e-8714a7e57e3c\") " pod="openshift-must-gather-d6kfv/crc-debug-w42ql" Mar 13 21:24:49 crc kubenswrapper[4790]: I0313 21:24:49.700602 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vgjk\" (UniqueName: \"kubernetes.io/projected/cedbf666-53d4-4452-849e-8714a7e57e3c-kube-api-access-5vgjk\") pod \"crc-debug-w42ql\" (UID: \"cedbf666-53d4-4452-849e-8714a7e57e3c\") " pod="openshift-must-gather-d6kfv/crc-debug-w42ql" Mar 13 21:24:49 crc kubenswrapper[4790]: I0313 21:24:49.802597 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cedbf666-53d4-4452-849e-8714a7e57e3c-host\") pod \"crc-debug-w42ql\" (UID: \"cedbf666-53d4-4452-849e-8714a7e57e3c\") " pod="openshift-must-gather-d6kfv/crc-debug-w42ql" Mar 13 21:24:49 crc kubenswrapper[4790]: I0313 21:24:49.802656 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vgjk\" (UniqueName: \"kubernetes.io/projected/cedbf666-53d4-4452-849e-8714a7e57e3c-kube-api-access-5vgjk\") pod \"crc-debug-w42ql\" (UID: 
\"cedbf666-53d4-4452-849e-8714a7e57e3c\") " pod="openshift-must-gather-d6kfv/crc-debug-w42ql" Mar 13 21:24:49 crc kubenswrapper[4790]: I0313 21:24:49.803055 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cedbf666-53d4-4452-849e-8714a7e57e3c-host\") pod \"crc-debug-w42ql\" (UID: \"cedbf666-53d4-4452-849e-8714a7e57e3c\") " pod="openshift-must-gather-d6kfv/crc-debug-w42ql" Mar 13 21:24:49 crc kubenswrapper[4790]: I0313 21:24:49.821110 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vgjk\" (UniqueName: \"kubernetes.io/projected/cedbf666-53d4-4452-849e-8714a7e57e3c-kube-api-access-5vgjk\") pod \"crc-debug-w42ql\" (UID: \"cedbf666-53d4-4452-849e-8714a7e57e3c\") " pod="openshift-must-gather-d6kfv/crc-debug-w42ql" Mar 13 21:24:49 crc kubenswrapper[4790]: I0313 21:24:49.892835 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-w42ql" Mar 13 21:24:50 crc kubenswrapper[4790]: I0313 21:24:50.941521 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6kfv/crc-debug-w42ql" event={"ID":"cedbf666-53d4-4452-849e-8714a7e57e3c","Type":"ContainerStarted","Data":"fc567be766ecf269be33edfb55f52a19879095d8f295cb35981738d3c130d459"} Mar 13 21:25:04 crc kubenswrapper[4790]: I0313 21:25:04.080333 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6kfv/crc-debug-w42ql" event={"ID":"cedbf666-53d4-4452-849e-8714a7e57e3c","Type":"ContainerStarted","Data":"18f925d7502ea72468acd4166123c68adbb9468a2d3d0cc2e7b5e323792b34d0"} Mar 13 21:25:04 crc kubenswrapper[4790]: I0313 21:25:04.097530 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d6kfv/crc-debug-w42ql" podStartSLOduration=1.446822304 podStartE2EDuration="15.097508356s" podCreationTimestamp="2026-03-13 21:24:49 +0000 UTC" 
firstStartedPulling="2026-03-13 21:24:49.932539536 +0000 UTC m=+3420.953655427" lastFinishedPulling="2026-03-13 21:25:03.583225588 +0000 UTC m=+3434.604341479" observedRunningTime="2026-03-13 21:25:04.094661918 +0000 UTC m=+3435.115777809" watchObservedRunningTime="2026-03-13 21:25:04.097508356 +0000 UTC m=+3435.118624257" Mar 13 21:25:14 crc kubenswrapper[4790]: I0313 21:25:14.016206 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:25:14 crc kubenswrapper[4790]: I0313 21:25:14.016823 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:25:42 crc kubenswrapper[4790]: I0313 21:25:42.419832 4790 generic.go:334] "Generic (PLEG): container finished" podID="cedbf666-53d4-4452-849e-8714a7e57e3c" containerID="18f925d7502ea72468acd4166123c68adbb9468a2d3d0cc2e7b5e323792b34d0" exitCode=0 Mar 13 21:25:42 crc kubenswrapper[4790]: I0313 21:25:42.419909 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6kfv/crc-debug-w42ql" event={"ID":"cedbf666-53d4-4452-849e-8714a7e57e3c","Type":"ContainerDied","Data":"18f925d7502ea72468acd4166123c68adbb9468a2d3d0cc2e7b5e323792b34d0"} Mar 13 21:25:43 crc kubenswrapper[4790]: I0313 21:25:43.518818 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-w42ql" Mar 13 21:25:43 crc kubenswrapper[4790]: I0313 21:25:43.550660 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d6kfv/crc-debug-w42ql"] Mar 13 21:25:43 crc kubenswrapper[4790]: I0313 21:25:43.560241 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d6kfv/crc-debug-w42ql"] Mar 13 21:25:43 crc kubenswrapper[4790]: I0313 21:25:43.668726 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cedbf666-53d4-4452-849e-8714a7e57e3c-host\") pod \"cedbf666-53d4-4452-849e-8714a7e57e3c\" (UID: \"cedbf666-53d4-4452-849e-8714a7e57e3c\") " Mar 13 21:25:43 crc kubenswrapper[4790]: I0313 21:25:43.668957 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vgjk\" (UniqueName: \"kubernetes.io/projected/cedbf666-53d4-4452-849e-8714a7e57e3c-kube-api-access-5vgjk\") pod \"cedbf666-53d4-4452-849e-8714a7e57e3c\" (UID: \"cedbf666-53d4-4452-849e-8714a7e57e3c\") " Mar 13 21:25:43 crc kubenswrapper[4790]: I0313 21:25:43.670523 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cedbf666-53d4-4452-849e-8714a7e57e3c-host" (OuterVolumeSpecName: "host") pod "cedbf666-53d4-4452-849e-8714a7e57e3c" (UID: "cedbf666-53d4-4452-849e-8714a7e57e3c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 21:25:43 crc kubenswrapper[4790]: I0313 21:25:43.674762 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cedbf666-53d4-4452-849e-8714a7e57e3c-kube-api-access-5vgjk" (OuterVolumeSpecName: "kube-api-access-5vgjk") pod "cedbf666-53d4-4452-849e-8714a7e57e3c" (UID: "cedbf666-53d4-4452-849e-8714a7e57e3c"). InnerVolumeSpecName "kube-api-access-5vgjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:25:43 crc kubenswrapper[4790]: I0313 21:25:43.772195 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vgjk\" (UniqueName: \"kubernetes.io/projected/cedbf666-53d4-4452-849e-8714a7e57e3c-kube-api-access-5vgjk\") on node \"crc\" DevicePath \"\"" Mar 13 21:25:43 crc kubenswrapper[4790]: I0313 21:25:43.772222 4790 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cedbf666-53d4-4452-849e-8714a7e57e3c-host\") on node \"crc\" DevicePath \"\"" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.016001 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.016053 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.436534 4790 scope.go:117] "RemoveContainer" containerID="18f925d7502ea72468acd4166123c68adbb9468a2d3d0cc2e7b5e323792b34d0" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.436576 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-w42ql" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.707355 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d6kfv/crc-debug-22ftk"] Mar 13 21:25:44 crc kubenswrapper[4790]: E0313 21:25:44.708035 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cedbf666-53d4-4452-849e-8714a7e57e3c" containerName="container-00" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.708049 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="cedbf666-53d4-4452-849e-8714a7e57e3c" containerName="container-00" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.708220 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="cedbf666-53d4-4452-849e-8714a7e57e3c" containerName="container-00" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.708826 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-22ftk" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.793257 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g85h\" (UniqueName: \"kubernetes.io/projected/f364e64b-c894-4978-8625-4f4680ad09f1-kube-api-access-4g85h\") pod \"crc-debug-22ftk\" (UID: \"f364e64b-c894-4978-8625-4f4680ad09f1\") " pod="openshift-must-gather-d6kfv/crc-debug-22ftk" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.793513 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f364e64b-c894-4978-8625-4f4680ad09f1-host\") pod \"crc-debug-22ftk\" (UID: \"f364e64b-c894-4978-8625-4f4680ad09f1\") " pod="openshift-must-gather-d6kfv/crc-debug-22ftk" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.895545 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g85h\" (UniqueName: 
\"kubernetes.io/projected/f364e64b-c894-4978-8625-4f4680ad09f1-kube-api-access-4g85h\") pod \"crc-debug-22ftk\" (UID: \"f364e64b-c894-4978-8625-4f4680ad09f1\") " pod="openshift-must-gather-d6kfv/crc-debug-22ftk" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.895611 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f364e64b-c894-4978-8625-4f4680ad09f1-host\") pod \"crc-debug-22ftk\" (UID: \"f364e64b-c894-4978-8625-4f4680ad09f1\") " pod="openshift-must-gather-d6kfv/crc-debug-22ftk" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.895817 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f364e64b-c894-4978-8625-4f4680ad09f1-host\") pod \"crc-debug-22ftk\" (UID: \"f364e64b-c894-4978-8625-4f4680ad09f1\") " pod="openshift-must-gather-d6kfv/crc-debug-22ftk" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.912361 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g85h\" (UniqueName: \"kubernetes.io/projected/f364e64b-c894-4978-8625-4f4680ad09f1-kube-api-access-4g85h\") pod \"crc-debug-22ftk\" (UID: \"f364e64b-c894-4978-8625-4f4680ad09f1\") " pod="openshift-must-gather-d6kfv/crc-debug-22ftk" Mar 13 21:25:45 crc kubenswrapper[4790]: I0313 21:25:45.027124 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-22ftk" Mar 13 21:25:45 crc kubenswrapper[4790]: I0313 21:25:45.447328 4790 generic.go:334] "Generic (PLEG): container finished" podID="f364e64b-c894-4978-8625-4f4680ad09f1" containerID="19ec3b81cfc93adcffb8135210e1ea8d379fb945e3eddc6ee978b60b4ce52405" exitCode=0 Mar 13 21:25:45 crc kubenswrapper[4790]: I0313 21:25:45.447527 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6kfv/crc-debug-22ftk" event={"ID":"f364e64b-c894-4978-8625-4f4680ad09f1","Type":"ContainerDied","Data":"19ec3b81cfc93adcffb8135210e1ea8d379fb945e3eddc6ee978b60b4ce52405"} Mar 13 21:25:45 crc kubenswrapper[4790]: I0313 21:25:45.447687 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6kfv/crc-debug-22ftk" event={"ID":"f364e64b-c894-4978-8625-4f4680ad09f1","Type":"ContainerStarted","Data":"360e4c0b8317039594325f893b5c1fb108d16716b89b77acd4c50141cbd8cdc9"} Mar 13 21:25:45 crc kubenswrapper[4790]: I0313 21:25:45.669724 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cedbf666-53d4-4452-849e-8714a7e57e3c" path="/var/lib/kubelet/pods/cedbf666-53d4-4452-849e-8714a7e57e3c/volumes" Mar 13 21:25:45 crc kubenswrapper[4790]: I0313 21:25:45.877676 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d6kfv/crc-debug-22ftk"] Mar 13 21:25:45 crc kubenswrapper[4790]: I0313 21:25:45.889055 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d6kfv/crc-debug-22ftk"] Mar 13 21:25:46 crc kubenswrapper[4790]: I0313 21:25:46.556918 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-22ftk" Mar 13 21:25:46 crc kubenswrapper[4790]: I0313 21:25:46.625232 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g85h\" (UniqueName: \"kubernetes.io/projected/f364e64b-c894-4978-8625-4f4680ad09f1-kube-api-access-4g85h\") pod \"f364e64b-c894-4978-8625-4f4680ad09f1\" (UID: \"f364e64b-c894-4978-8625-4f4680ad09f1\") " Mar 13 21:25:46 crc kubenswrapper[4790]: I0313 21:25:46.625302 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f364e64b-c894-4978-8625-4f4680ad09f1-host\") pod \"f364e64b-c894-4978-8625-4f4680ad09f1\" (UID: \"f364e64b-c894-4978-8625-4f4680ad09f1\") " Mar 13 21:25:46 crc kubenswrapper[4790]: I0313 21:25:46.625635 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f364e64b-c894-4978-8625-4f4680ad09f1-host" (OuterVolumeSpecName: "host") pod "f364e64b-c894-4978-8625-4f4680ad09f1" (UID: "f364e64b-c894-4978-8625-4f4680ad09f1"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 21:25:46 crc kubenswrapper[4790]: I0313 21:25:46.625985 4790 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f364e64b-c894-4978-8625-4f4680ad09f1-host\") on node \"crc\" DevicePath \"\"" Mar 13 21:25:46 crc kubenswrapper[4790]: I0313 21:25:46.630638 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f364e64b-c894-4978-8625-4f4680ad09f1-kube-api-access-4g85h" (OuterVolumeSpecName: "kube-api-access-4g85h") pod "f364e64b-c894-4978-8625-4f4680ad09f1" (UID: "f364e64b-c894-4978-8625-4f4680ad09f1"). InnerVolumeSpecName "kube-api-access-4g85h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:25:46 crc kubenswrapper[4790]: I0313 21:25:46.727599 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g85h\" (UniqueName: \"kubernetes.io/projected/f364e64b-c894-4978-8625-4f4680ad09f1-kube-api-access-4g85h\") on node \"crc\" DevicePath \"\"" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.051551 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d6kfv/crc-debug-8tm75"] Mar 13 21:25:47 crc kubenswrapper[4790]: E0313 21:25:47.053053 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f364e64b-c894-4978-8625-4f4680ad09f1" containerName="container-00" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.053099 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f364e64b-c894-4978-8625-4f4680ad09f1" containerName="container-00" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.053283 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f364e64b-c894-4978-8625-4f4680ad09f1" containerName="container-00" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.053961 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-8tm75" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.134747 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ea26912-deee-4ff3-ad55-a6d703f99865-host\") pod \"crc-debug-8tm75\" (UID: \"6ea26912-deee-4ff3-ad55-a6d703f99865\") " pod="openshift-must-gather-d6kfv/crc-debug-8tm75" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.135199 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6qbl\" (UniqueName: \"kubernetes.io/projected/6ea26912-deee-4ff3-ad55-a6d703f99865-kube-api-access-z6qbl\") pod \"crc-debug-8tm75\" (UID: \"6ea26912-deee-4ff3-ad55-a6d703f99865\") " pod="openshift-must-gather-d6kfv/crc-debug-8tm75" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.236609 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6qbl\" (UniqueName: \"kubernetes.io/projected/6ea26912-deee-4ff3-ad55-a6d703f99865-kube-api-access-z6qbl\") pod \"crc-debug-8tm75\" (UID: \"6ea26912-deee-4ff3-ad55-a6d703f99865\") " pod="openshift-must-gather-d6kfv/crc-debug-8tm75" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.236721 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ea26912-deee-4ff3-ad55-a6d703f99865-host\") pod \"crc-debug-8tm75\" (UID: \"6ea26912-deee-4ff3-ad55-a6d703f99865\") " pod="openshift-must-gather-d6kfv/crc-debug-8tm75" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.236957 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ea26912-deee-4ff3-ad55-a6d703f99865-host\") pod \"crc-debug-8tm75\" (UID: \"6ea26912-deee-4ff3-ad55-a6d703f99865\") " pod="openshift-must-gather-d6kfv/crc-debug-8tm75" Mar 13 21:25:47 crc 
kubenswrapper[4790]: I0313 21:25:47.256338 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6qbl\" (UniqueName: \"kubernetes.io/projected/6ea26912-deee-4ff3-ad55-a6d703f99865-kube-api-access-z6qbl\") pod \"crc-debug-8tm75\" (UID: \"6ea26912-deee-4ff3-ad55-a6d703f99865\") " pod="openshift-must-gather-d6kfv/crc-debug-8tm75" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.370495 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-8tm75" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.466622 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="360e4c0b8317039594325f893b5c1fb108d16716b89b77acd4c50141cbd8cdc9" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.466664 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-22ftk" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.467958 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6kfv/crc-debug-8tm75" event={"ID":"6ea26912-deee-4ff3-ad55-a6d703f99865","Type":"ContainerStarted","Data":"40c1c24f22a918346bd6c67a469263cd47d726ef2ba65c9cee278982982158da"} Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.696107 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f364e64b-c894-4978-8625-4f4680ad09f1" path="/var/lib/kubelet/pods/f364e64b-c894-4978-8625-4f4680ad09f1/volumes" Mar 13 21:25:48 crc kubenswrapper[4790]: I0313 21:25:48.477517 4790 generic.go:334] "Generic (PLEG): container finished" podID="6ea26912-deee-4ff3-ad55-a6d703f99865" containerID="4dfa4dd75f80f12e10343292ad55bf7e12db50998c94a2d0043e7456c83d4512" exitCode=0 Mar 13 21:25:48 crc kubenswrapper[4790]: I0313 21:25:48.477583 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6kfv/crc-debug-8tm75" 
event={"ID":"6ea26912-deee-4ff3-ad55-a6d703f99865","Type":"ContainerDied","Data":"4dfa4dd75f80f12e10343292ad55bf7e12db50998c94a2d0043e7456c83d4512"} Mar 13 21:25:48 crc kubenswrapper[4790]: I0313 21:25:48.515362 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d6kfv/crc-debug-8tm75"] Mar 13 21:25:48 crc kubenswrapper[4790]: I0313 21:25:48.525494 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d6kfv/crc-debug-8tm75"] Mar 13 21:25:49 crc kubenswrapper[4790]: I0313 21:25:49.604018 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-8tm75" Mar 13 21:25:49 crc kubenswrapper[4790]: I0313 21:25:49.686125 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ea26912-deee-4ff3-ad55-a6d703f99865-host\") pod \"6ea26912-deee-4ff3-ad55-a6d703f99865\" (UID: \"6ea26912-deee-4ff3-ad55-a6d703f99865\") " Mar 13 21:25:49 crc kubenswrapper[4790]: I0313 21:25:49.686244 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6qbl\" (UniqueName: \"kubernetes.io/projected/6ea26912-deee-4ff3-ad55-a6d703f99865-kube-api-access-z6qbl\") pod \"6ea26912-deee-4ff3-ad55-a6d703f99865\" (UID: \"6ea26912-deee-4ff3-ad55-a6d703f99865\") " Mar 13 21:25:49 crc kubenswrapper[4790]: I0313 21:25:49.686570 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ea26912-deee-4ff3-ad55-a6d703f99865-host" (OuterVolumeSpecName: "host") pod "6ea26912-deee-4ff3-ad55-a6d703f99865" (UID: "6ea26912-deee-4ff3-ad55-a6d703f99865"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 21:25:49 crc kubenswrapper[4790]: I0313 21:25:49.687113 4790 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ea26912-deee-4ff3-ad55-a6d703f99865-host\") on node \"crc\" DevicePath \"\"" Mar 13 21:25:49 crc kubenswrapper[4790]: I0313 21:25:49.693856 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea26912-deee-4ff3-ad55-a6d703f99865-kube-api-access-z6qbl" (OuterVolumeSpecName: "kube-api-access-z6qbl") pod "6ea26912-deee-4ff3-ad55-a6d703f99865" (UID: "6ea26912-deee-4ff3-ad55-a6d703f99865"). InnerVolumeSpecName "kube-api-access-z6qbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:25:49 crc kubenswrapper[4790]: I0313 21:25:49.789530 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6qbl\" (UniqueName: \"kubernetes.io/projected/6ea26912-deee-4ff3-ad55-a6d703f99865-kube-api-access-z6qbl\") on node \"crc\" DevicePath \"\"" Mar 13 21:25:50 crc kubenswrapper[4790]: I0313 21:25:50.493799 4790 scope.go:117] "RemoveContainer" containerID="4dfa4dd75f80f12e10343292ad55bf7e12db50998c94a2d0043e7456c83d4512" Mar 13 21:25:50 crc kubenswrapper[4790]: I0313 21:25:50.493826 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-8tm75" Mar 13 21:25:50 crc kubenswrapper[4790]: E0313 21:25:50.591886 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ea26912_deee_4ff3_ad55_a6d703f99865.slice/crio-40c1c24f22a918346bd6c67a469263cd47d726ef2ba65c9cee278982982158da\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ea26912_deee_4ff3_ad55_a6d703f99865.slice\": RecentStats: unable to find data in memory cache]" Mar 13 21:25:51 crc kubenswrapper[4790]: I0313 21:25:51.671795 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea26912-deee-4ff3-ad55-a6d703f99865" path="/var/lib/kubelet/pods/6ea26912-deee-4ff3-ad55-a6d703f99865/volumes" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.151819 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557286-j2hgs"] Mar 13 21:26:00 crc kubenswrapper[4790]: E0313 21:26:00.152854 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea26912-deee-4ff3-ad55-a6d703f99865" containerName="container-00" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.152871 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea26912-deee-4ff3-ad55-a6d703f99865" containerName="container-00" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.153079 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ea26912-deee-4ff3-ad55-a6d703f99865" containerName="container-00" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.153845 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557286-j2hgs" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.156540 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.156566 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.157870 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.164880 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557286-j2hgs"] Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.289160 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7tqs\" (UniqueName: \"kubernetes.io/projected/93d597e8-cf42-4f34-a6c1-ffe9416a562b-kube-api-access-f7tqs\") pod \"auto-csr-approver-29557286-j2hgs\" (UID: \"93d597e8-cf42-4f34-a6c1-ffe9416a562b\") " pod="openshift-infra/auto-csr-approver-29557286-j2hgs" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.390553 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7tqs\" (UniqueName: \"kubernetes.io/projected/93d597e8-cf42-4f34-a6c1-ffe9416a562b-kube-api-access-f7tqs\") pod \"auto-csr-approver-29557286-j2hgs\" (UID: \"93d597e8-cf42-4f34-a6c1-ffe9416a562b\") " pod="openshift-infra/auto-csr-approver-29557286-j2hgs" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.424907 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7tqs\" (UniqueName: \"kubernetes.io/projected/93d597e8-cf42-4f34-a6c1-ffe9416a562b-kube-api-access-f7tqs\") pod \"auto-csr-approver-29557286-j2hgs\" (UID: \"93d597e8-cf42-4f34-a6c1-ffe9416a562b\") " 
pod="openshift-infra/auto-csr-approver-29557286-j2hgs" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.477415 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557286-j2hgs" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.931308 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557286-j2hgs"] Mar 13 21:26:01 crc kubenswrapper[4790]: I0313 21:26:01.659210 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557286-j2hgs" event={"ID":"93d597e8-cf42-4f34-a6c1-ffe9416a562b","Type":"ContainerStarted","Data":"ce940e262f90be5fa54ef8e07bd777275cab1626fc349844b9170a8412018fb7"} Mar 13 21:26:02 crc kubenswrapper[4790]: I0313 21:26:02.669607 4790 generic.go:334] "Generic (PLEG): container finished" podID="93d597e8-cf42-4f34-a6c1-ffe9416a562b" containerID="d255e7ab1f308e1f21736aa4f57843906cd9283c436db74b17e9a79b7ff4810a" exitCode=0 Mar 13 21:26:02 crc kubenswrapper[4790]: I0313 21:26:02.669733 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557286-j2hgs" event={"ID":"93d597e8-cf42-4f34-a6c1-ffe9416a562b","Type":"ContainerDied","Data":"d255e7ab1f308e1f21736aa4f57843906cd9283c436db74b17e9a79b7ff4810a"} Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.098968 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557286-j2hgs" Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.168531 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7tqs\" (UniqueName: \"kubernetes.io/projected/93d597e8-cf42-4f34-a6c1-ffe9416a562b-kube-api-access-f7tqs\") pod \"93d597e8-cf42-4f34-a6c1-ffe9416a562b\" (UID: \"93d597e8-cf42-4f34-a6c1-ffe9416a562b\") " Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.175355 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93d597e8-cf42-4f34-a6c1-ffe9416a562b-kube-api-access-f7tqs" (OuterVolumeSpecName: "kube-api-access-f7tqs") pod "93d597e8-cf42-4f34-a6c1-ffe9416a562b" (UID: "93d597e8-cf42-4f34-a6c1-ffe9416a562b"). InnerVolumeSpecName "kube-api-access-f7tqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.271159 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7tqs\" (UniqueName: \"kubernetes.io/projected/93d597e8-cf42-4f34-a6c1-ffe9416a562b-kube-api-access-f7tqs\") on node \"crc\" DevicePath \"\"" Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.333677 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7c6887dbdb-wnl4x_dc5e5f2f-999a-4ae6-82f1-d5942a570a3e/barbican-api/0.log" Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.486650 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-798f469b5d-gs7bt_8a191811-ef81-4066-bcbb-0385c9258fc0/barbican-keystone-listener/0.log" Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.534013 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7c6887dbdb-wnl4x_dc5e5f2f-999a-4ae6-82f1-d5942a570a3e/barbican-api-log/0.log" Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.550799 4790 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_barbican-keystone-listener-798f469b5d-gs7bt_8a191811-ef81-4066-bcbb-0385c9258fc0/barbican-keystone-listener-log/0.log" Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.687283 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557286-j2hgs" event={"ID":"93d597e8-cf42-4f34-a6c1-ffe9416a562b","Type":"ContainerDied","Data":"ce940e262f90be5fa54ef8e07bd777275cab1626fc349844b9170a8412018fb7"} Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.687321 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557286-j2hgs" Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.687327 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce940e262f90be5fa54ef8e07bd777275cab1626fc349844b9170a8412018fb7" Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.776314 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5d9ddc9bbc-tg88r_98f92730-30b3-4583-ab7c-258c0a0880a2/barbican-worker/0.log" Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.799159 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5d9ddc9bbc-tg88r_98f92730-30b3-4583-ab7c-258c0a0880a2/barbican-worker-log/0.log" Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.997347 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n_5fc3181b-a2df-4d5c-afa1-057cef46dd95/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.077892 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d2645f50-482e-487d-9b16-c2a066630480/ceilometer-central-agent/0.log" Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.133974 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_d2645f50-482e-487d-9b16-c2a066630480/ceilometer-notification-agent/0.log" Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.173340 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557280-f6wtq"] Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.184034 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557280-f6wtq"] Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.308338 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d2645f50-482e-487d-9b16-c2a066630480/proxy-httpd/0.log" Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.350174 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d2645f50-482e-487d-9b16-c2a066630480/sg-core/0.log" Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.379661 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c42a2a27-f7c5-463b-982a-4dafcac978ad/cinder-api/0.log" Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.563526 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c42a2a27-f7c5-463b-982a-4dafcac978ad/cinder-api-log/0.log" Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.622193 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5/cinder-scheduler/0.log" Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.756806 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c215866d-1e07-4033-8e8a-d7826692bc76" path="/var/lib/kubelet/pods/c215866d-1e07-4033-8e8a-d7826692bc76/volumes" Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.786109 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5/probe/0.log" Mar 13 21:26:05 crc 
kubenswrapper[4790]: I0313 21:26:05.947489 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-vg564_c1609d29-96e5-43eb-a086-5587ca7c4f5a/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:06 crc kubenswrapper[4790]: I0313 21:26:06.063150 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk_f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:06 crc kubenswrapper[4790]: I0313 21:26:06.180532 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-p5ml2_66175627-2b03-49c6-a7a1-de69f8851d9a/init/0.log" Mar 13 21:26:06 crc kubenswrapper[4790]: I0313 21:26:06.378133 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-p5ml2_66175627-2b03-49c6-a7a1-de69f8851d9a/init/0.log" Mar 13 21:26:06 crc kubenswrapper[4790]: I0313 21:26:06.391587 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-p5ml2_66175627-2b03-49c6-a7a1-de69f8851d9a/dnsmasq-dns/0.log" Mar 13 21:26:06 crc kubenswrapper[4790]: I0313 21:26:06.499837 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-cfp58_304addb4-f579-42f8-87d8-8e15b713aef2/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:06 crc kubenswrapper[4790]: I0313 21:26:06.626486 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8c1c1847-eb77-4170-8034-e58ba375ad84/glance-httpd/0.log" Mar 13 21:26:06 crc kubenswrapper[4790]: I0313 21:26:06.633054 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8c1c1847-eb77-4170-8034-e58ba375ad84/glance-log/0.log" Mar 13 21:26:06 crc kubenswrapper[4790]: I0313 
21:26:06.826900 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b5b10e44-e0ce-4568-b33c-dd9855d61fd7/glance-httpd/0.log" Mar 13 21:26:06 crc kubenswrapper[4790]: I0313 21:26:06.844625 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b5b10e44-e0ce-4568-b33c-dd9855d61fd7/glance-log/0.log" Mar 13 21:26:07 crc kubenswrapper[4790]: I0313 21:26:07.107090 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-686b857b8-6fghv_d0f5105d-51ea-4e5e-832f-8302188a943a/horizon/0.log" Mar 13 21:26:07 crc kubenswrapper[4790]: I0313 21:26:07.209953 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-mznb5_77bc94c9-b530-4ea9-8c94-0d5a985fb930/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:07 crc kubenswrapper[4790]: I0313 21:26:07.367686 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-686b857b8-6fghv_d0f5105d-51ea-4e5e-832f-8302188a943a/horizon-log/0.log" Mar 13 21:26:07 crc kubenswrapper[4790]: I0313 21:26:07.392347 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-mjj4b_04553c47-94a9-465f-a241-9188784794de/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:07 crc kubenswrapper[4790]: I0313 21:26:07.745717 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29557261-5pp9q_65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648/keystone-cron/0.log" Mar 13 21:26:07 crc kubenswrapper[4790]: I0313 21:26:07.787027 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-c5788df58-llnz4_4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7/keystone-api/0.log" Mar 13 21:26:07 crc kubenswrapper[4790]: I0313 21:26:07.960503 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_2ae1ef11-086d-4d65-bfcb-987f3973fdc5/kube-state-metrics/0.log" Mar 13 21:26:07 crc kubenswrapper[4790]: I0313 21:26:07.984392 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx_c70cf667-ebdd-414d-be40-62d26209abcf/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:08 crc kubenswrapper[4790]: I0313 21:26:08.297672 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-77f687ff4f-d7b7z_6c6f5d56-217d-441e-8771-503fd5e681fb/neutron-api/0.log" Mar 13 21:26:08 crc kubenswrapper[4790]: I0313 21:26:08.377111 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-77f687ff4f-d7b7z_6c6f5d56-217d-441e-8771-503fd5e681fb/neutron-httpd/0.log" Mar 13 21:26:08 crc kubenswrapper[4790]: I0313 21:26:08.559877 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4_944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:09 crc kubenswrapper[4790]: I0313 21:26:09.136890 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4597d91c-0f9f-4e33-aaa7-b25e7076e13a/nova-api-log/0.log" Mar 13 21:26:09 crc kubenswrapper[4790]: I0313 21:26:09.450279 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_253ef3a1-1764-4120-a5f8-db908a0e7fd4/nova-cell0-conductor-conductor/0.log" Mar 13 21:26:09 crc kubenswrapper[4790]: I0313 21:26:09.499233 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b/nova-cell1-conductor-conductor/0.log" Mar 13 21:26:09 crc kubenswrapper[4790]: I0313 21:26:09.557977 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4597d91c-0f9f-4e33-aaa7-b25e7076e13a/nova-api-api/0.log" 
Mar 13 21:26:09 crc kubenswrapper[4790]: I0313 21:26:09.789395 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_20c0842a-c69a-4af0-aef0-ffec3f3560bc/nova-cell1-novncproxy-novncproxy/0.log" Mar 13 21:26:09 crc kubenswrapper[4790]: I0313 21:26:09.890126 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-lhxzv_7b947c94-305d-453d-b2f0-bcf3c84467b3/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:10 crc kubenswrapper[4790]: I0313 21:26:10.048821 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_00b43558-bdf4-45e4-b1bc-6e9b325e163b/nova-metadata-log/0.log" Mar 13 21:26:10 crc kubenswrapper[4790]: I0313 21:26:10.286339 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_fa2face0-9349-4482-880a-b23cf41099b2/mysql-bootstrap/0.log" Mar 13 21:26:10 crc kubenswrapper[4790]: I0313 21:26:10.308585 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_01e86425-f126-4827-b727-e8c73d152aa6/nova-scheduler-scheduler/0.log" Mar 13 21:26:10 crc kubenswrapper[4790]: I0313 21:26:10.363251 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_00b43558-bdf4-45e4-b1bc-6e9b325e163b/nova-metadata-metadata/0.log" Mar 13 21:26:10 crc kubenswrapper[4790]: I0313 21:26:10.432055 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_fa2face0-9349-4482-880a-b23cf41099b2/mysql-bootstrap/0.log" Mar 13 21:26:10 crc kubenswrapper[4790]: I0313 21:26:10.470482 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_fa2face0-9349-4482-880a-b23cf41099b2/galera/0.log" Mar 13 21:26:10 crc kubenswrapper[4790]: I0313 21:26:10.582176 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_fceb0829-5f0e-4e78-a803-61afc5aa4d60/mysql-bootstrap/0.log" Mar 13 21:26:10 crc kubenswrapper[4790]: I0313 21:26:10.785086 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_fceb0829-5f0e-4e78-a803-61afc5aa4d60/galera/0.log" Mar 13 21:26:10 crc kubenswrapper[4790]: I0313 21:26:10.809523 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_fceb0829-5f0e-4e78-a803-61afc5aa4d60/mysql-bootstrap/0.log" Mar 13 21:26:10 crc kubenswrapper[4790]: I0313 21:26:10.825446 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_7f0237c2-5c72-4776-9226-67244abca8dd/openstackclient/0.log" Mar 13 21:26:11 crc kubenswrapper[4790]: I0313 21:26:11.030050 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-nrv7g_dfb0e0ca-d164-4e22-9d3f-055a45a372d2/openstack-network-exporter/0.log" Mar 13 21:26:11 crc kubenswrapper[4790]: I0313 21:26:11.102604 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k7bzr_8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58/ovsdb-server-init/0.log" Mar 13 21:26:11 crc kubenswrapper[4790]: I0313 21:26:11.341225 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k7bzr_8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58/ovs-vswitchd/0.log" Mar 13 21:26:11 crc kubenswrapper[4790]: I0313 21:26:11.356949 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k7bzr_8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58/ovsdb-server/0.log" Mar 13 21:26:11 crc kubenswrapper[4790]: I0313 21:26:11.361504 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k7bzr_8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58/ovsdb-server-init/0.log" Mar 13 21:26:11 crc kubenswrapper[4790]: I0313 21:26:11.535802 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-vspq5_c72ac557-7882-4120-b64a-4343639cc766/ovn-controller/0.log" Mar 13 21:26:11 crc kubenswrapper[4790]: I0313 21:26:11.623289 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-7d9cq_fe27e2d5-7108-4d49-99bb-15208f36cff7/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:11 crc kubenswrapper[4790]: I0313 21:26:11.843116 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_18e18c94-0ce6-4578-a224-384826512a34/ovn-northd/0.log" Mar 13 21:26:11 crc kubenswrapper[4790]: I0313 21:26:11.870884 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_18e18c94-0ce6-4578-a224-384826512a34/openstack-network-exporter/0.log" Mar 13 21:26:11 crc kubenswrapper[4790]: I0313 21:26:11.953235 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_f5a24d7e-902f-4862-9c6b-8317f8fb3f29/openstack-network-exporter/0.log" Mar 13 21:26:12 crc kubenswrapper[4790]: I0313 21:26:12.112133 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_f5a24d7e-902f-4862-9c6b-8317f8fb3f29/ovsdbserver-nb/0.log" Mar 13 21:26:12 crc kubenswrapper[4790]: I0313 21:26:12.114762 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ba4867dc-70fb-4533-a075-31fc03f7ef33/openstack-network-exporter/0.log" Mar 13 21:26:12 crc kubenswrapper[4790]: I0313 21:26:12.147228 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ba4867dc-70fb-4533-a075-31fc03f7ef33/ovsdbserver-sb/0.log" Mar 13 21:26:12 crc kubenswrapper[4790]: I0313 21:26:12.362485 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-854ddc4bd-b4ws7_b14c1738-5e9e-4810-b926-5b05af9ec22d/placement-api/0.log" Mar 13 21:26:12 crc kubenswrapper[4790]: I0313 21:26:12.422847 4790 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_placement-854ddc4bd-b4ws7_b14c1738-5e9e-4810-b926-5b05af9ec22d/placement-log/0.log" Mar 13 21:26:12 crc kubenswrapper[4790]: I0313 21:26:12.532952 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9/setup-container/0.log" Mar 13 21:26:12 crc kubenswrapper[4790]: I0313 21:26:12.729235 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9/setup-container/0.log" Mar 13 21:26:12 crc kubenswrapper[4790]: I0313 21:26:12.844170 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9/rabbitmq/0.log" Mar 13 21:26:12 crc kubenswrapper[4790]: I0313 21:26:12.881850 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_72ed8a4f-a46a-4e41-9335-f10dc6338627/setup-container/0.log" Mar 13 21:26:13 crc kubenswrapper[4790]: I0313 21:26:13.002085 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_72ed8a4f-a46a-4e41-9335-f10dc6338627/setup-container/0.log" Mar 13 21:26:13 crc kubenswrapper[4790]: I0313 21:26:13.090875 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt_7cb0d614-f5d9-4862-8059-ad323eec6c59/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:13 crc kubenswrapper[4790]: I0313 21:26:13.100080 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_72ed8a4f-a46a-4e41-9335-f10dc6338627/rabbitmq/0.log" Mar 13 21:26:13 crc kubenswrapper[4790]: I0313 21:26:13.298556 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-zgxl2_6383acac-fad0-45d2-8263-da2ceb0b9e83/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:13 crc 
kubenswrapper[4790]: I0313 21:26:13.345274 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h_37459d15-1599-492b-8710-7723829a096d/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:13 crc kubenswrapper[4790]: I0313 21:26:13.509625 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rhtx8_d19bd67c-441b-4813-8cc3-07c8cf446e42/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:13 crc kubenswrapper[4790]: I0313 21:26:13.627457 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-pdxrj_ee77aab6-b3c2-4925-a715-428a4c5e5bd9/ssh-known-hosts-edpm-deployment/0.log" Mar 13 21:26:13 crc kubenswrapper[4790]: I0313 21:26:13.848325 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-798495789f-5fvw5_7d498924-f84f-48aa-b971-b58cbea48295/proxy-server/0.log" Mar 13 21:26:13 crc kubenswrapper[4790]: I0313 21:26:13.909368 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-798495789f-5fvw5_7d498924-f84f-48aa-b971-b58cbea48295/proxy-httpd/0.log" Mar 13 21:26:13 crc kubenswrapper[4790]: I0313 21:26:13.980124 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-dv686_b4ea3695-dddc-48fe-bdb6-eb0450c697c4/swift-ring-rebalance/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.015209 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.015275 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" 
podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.015327 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.016219 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5e764877937c3d83a4b1853363d471bb75b0ef968565309da1f28c291b8d45e7"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.016292 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://5e764877937c3d83a4b1853363d471bb75b0ef968565309da1f28c291b8d45e7" gracePeriod=600 Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.119542 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/account-reaper/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.206950 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/account-auditor/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.221281 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/account-replicator/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.294452 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/account-server/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.392136 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/container-auditor/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.419020 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/container-replicator/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.442436 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/container-server/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.573164 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/object-auditor/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.599972 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/container-updater/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.613821 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/object-expirer/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.695918 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/object-replicator/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.773821 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/object-server/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.858627 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/rsync/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.887405 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="5e764877937c3d83a4b1853363d471bb75b0ef968565309da1f28c291b8d45e7" exitCode=0 Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.887464 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"5e764877937c3d83a4b1853363d471bb75b0ef968565309da1f28c291b8d45e7"} Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.887498 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80"} Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.887520 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.935960 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/object-updater/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.960972 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/swift-recon-cron/0.log" Mar 13 21:26:15 crc kubenswrapper[4790]: I0313 21:26:15.167539 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr_71b17a66-faf5-4379-ace9-a4fff12cac5b/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:15 crc kubenswrapper[4790]: I0313 21:26:15.212624 4790 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_50c1f858-4451-4e6e-9e80-6e37528305a2/tempest-tests-tempest-tests-runner/0.log" Mar 13 21:26:15 crc kubenswrapper[4790]: I0313 21:26:15.366971 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-n54ff_20beb5d9-49e6-47c7-a3ad-107ff79e56fd/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:19 crc kubenswrapper[4790]: I0313 21:26:19.545035 4790 scope.go:117] "RemoveContainer" containerID="1dfb1a39dcbf9770c39e6abee624c19e7caa14a0b69762f480ec12e76586b37f" Mar 13 21:26:23 crc kubenswrapper[4790]: I0313 21:26:23.724774 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_3980f8da-ddaa-4634-8c09-1a71ae19c58f/memcached/0.log" Mar 13 21:26:38 crc kubenswrapper[4790]: I0313 21:26:38.856401 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk_4f787e63-2dda-4c6f-9c43-0b61658fed8c/util/0.log" Mar 13 21:26:39 crc kubenswrapper[4790]: I0313 21:26:39.083791 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk_4f787e63-2dda-4c6f-9c43-0b61658fed8c/util/0.log" Mar 13 21:26:39 crc kubenswrapper[4790]: I0313 21:26:39.085455 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk_4f787e63-2dda-4c6f-9c43-0b61658fed8c/pull/0.log" Mar 13 21:26:39 crc kubenswrapper[4790]: I0313 21:26:39.122597 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk_4f787e63-2dda-4c6f-9c43-0b61658fed8c/pull/0.log" Mar 13 21:26:39 crc kubenswrapper[4790]: I0313 21:26:39.332914 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk_4f787e63-2dda-4c6f-9c43-0b61658fed8c/util/0.log" Mar 13 21:26:39 crc kubenswrapper[4790]: I0313 21:26:39.340981 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk_4f787e63-2dda-4c6f-9c43-0b61658fed8c/extract/0.log" Mar 13 21:26:39 crc kubenswrapper[4790]: I0313 21:26:39.348929 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk_4f787e63-2dda-4c6f-9c43-0b61658fed8c/pull/0.log" Mar 13 21:26:39 crc kubenswrapper[4790]: I0313 21:26:39.658099 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-d47688694-s8p67_bdbe5269-1150-4269-bc28-1d719f1b77b6/manager/0.log" Mar 13 21:26:39 crc kubenswrapper[4790]: I0313 21:26:39.799075 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-h7rc9_46fb44a5-f567-4f58-80b1-dd70694f9339/manager/0.log" Mar 13 21:26:40 crc kubenswrapper[4790]: I0313 21:26:40.039220 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-tzx96_e154cc44-2769-4bfe-b8ef-3f6c56f08f74/manager/0.log" Mar 13 21:26:40 crc kubenswrapper[4790]: I0313 21:26:40.053289 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-q5nj7_a7488d00-50bc-4ce8-ae0a-8d3ff807c0da/manager/0.log" Mar 13 21:26:40 crc kubenswrapper[4790]: I0313 21:26:40.256671 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-nzdzx_460b6997-f558-4e5f-9e15-aa33fece4f4b/manager/0.log" Mar 13 21:26:40 crc kubenswrapper[4790]: I0313 21:26:40.480904 
4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-5plwh_dd8df218-c492-4e48-93a9-f5f2dbf7fc00/manager/0.log" Mar 13 21:26:40 crc kubenswrapper[4790]: I0313 21:26:40.575657 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bc894d9b-wfltj_2747d064-d45f-4a4e-87c2-d2c9f82eac10/manager/0.log" Mar 13 21:26:40 crc kubenswrapper[4790]: I0313 21:26:40.757613 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-54dc5b8f8d-jrr7h_7caf7136-8a46-410b-8a32-72ab19e8baca/manager/0.log" Mar 13 21:26:40 crc kubenswrapper[4790]: I0313 21:26:40.806298 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-5vcsg_77f24ce6-bc52-4831-902c-255983a8f911/manager/0.log" Mar 13 21:26:40 crc kubenswrapper[4790]: I0313 21:26:40.979598 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-57b484b4df-hlk9s_b5a018c4-3e3a-4f77-a272-20c94a5b9c7a/manager/0.log" Mar 13 21:26:41 crc kubenswrapper[4790]: I0313 21:26:41.024991 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v_5befe4e4-4574-42ac-90ce-ac67c1e33eee/manager/0.log" Mar 13 21:26:41 crc kubenswrapper[4790]: I0313 21:26:41.182993 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-dxntp_499aa973-6f5e-4229-9282-52c4fbf0625f/manager/0.log" Mar 13 21:26:41 crc kubenswrapper[4790]: I0313 21:26:41.333860 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7f84474648-b8lpj_386f7e46-c2e3-4eae-aa82-05075883c889/manager/0.log" Mar 13 21:26:41 crc kubenswrapper[4790]: I0313 
21:26:41.395059 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-tbbfl_403c2990-8871-47da-abd8-8c9fc5753d54/manager/0.log" Mar 13 21:26:41 crc kubenswrapper[4790]: I0313 21:26:41.529466 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn_5622f52e-2e94-41ca-a9d2-a0c833895937/manager/0.log" Mar 13 21:26:41 crc kubenswrapper[4790]: I0313 21:26:41.694472 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5c46d6fb64-bj72t_87b8083b-23ab-4733-a7ac-85bf1e565551/operator/0.log" Mar 13 21:26:41 crc kubenswrapper[4790]: I0313 21:26:41.866194 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-58vcj_db35ffd8-ac53-48ad-8035-53066c9df48b/registry-server/0.log" Mar 13 21:26:42 crc kubenswrapper[4790]: I0313 21:26:42.180365 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-c9lbv_b36f993b-25cd-4f12-bf48-77bf6f4cf26b/manager/0.log" Mar 13 21:26:42 crc kubenswrapper[4790]: I0313 21:26:42.213823 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-hwdv8_b1273818-139a-4213-b23c-609a7305c92f/manager/0.log" Mar 13 21:26:42 crc kubenswrapper[4790]: I0313 21:26:42.512954 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xvrl9_22e6d110-bd87-4d28-851d-307b4223ee8f/operator/0.log" Mar 13 21:26:42 crc kubenswrapper[4790]: I0313 21:26:42.689621 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-7f9cc5dd44-ppzzz_0244e4ae-2ccd-482a-b490-58a8e46ab53d/manager/0.log" Mar 13 21:26:42 crc 
kubenswrapper[4790]: I0313 21:26:42.781513 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6854b8b9d9-f8l4s_2032df10-91a5-4a88-9705-c355f50a5024/manager/0.log" Mar 13 21:26:43 crc kubenswrapper[4790]: I0313 21:26:43.027547 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-cfb9g_a36ba835-deb4-41f5-9b6a-57d1e577c8b1/manager/0.log" Mar 13 21:26:43 crc kubenswrapper[4790]: I0313 21:26:43.028614 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5698bc49b8-xpzcd_bf0c2c50-711c-4fbd-8c15-64bf6fc3572b/manager/0.log" Mar 13 21:26:43 crc kubenswrapper[4790]: I0313 21:26:43.151012 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-5689f_47bdfeda-c97a-40b5-82f8-1008ba20e75b/manager/0.log" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.206630 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4xklq"] Mar 13 21:26:51 crc kubenswrapper[4790]: E0313 21:26:51.207738 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93d597e8-cf42-4f34-a6c1-ffe9416a562b" containerName="oc" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.207753 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d597e8-cf42-4f34-a6c1-ffe9416a562b" containerName="oc" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.207958 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="93d597e8-cf42-4f34-a6c1-ffe9416a562b" containerName="oc" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.209489 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.229150 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xklq"] Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.290971 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-utilities\") pod \"redhat-marketplace-4xklq\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.291150 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jssdj\" (UniqueName: \"kubernetes.io/projected/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-kube-api-access-jssdj\") pod \"redhat-marketplace-4xklq\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.291320 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-catalog-content\") pod \"redhat-marketplace-4xklq\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.393342 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-catalog-content\") pod \"redhat-marketplace-4xklq\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.393473 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-utilities\") pod \"redhat-marketplace-4xklq\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.393540 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jssdj\" (UniqueName: \"kubernetes.io/projected/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-kube-api-access-jssdj\") pod \"redhat-marketplace-4xklq\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.394435 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-catalog-content\") pod \"redhat-marketplace-4xklq\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.394712 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-utilities\") pod \"redhat-marketplace-4xklq\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.415655 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jssdj\" (UniqueName: \"kubernetes.io/projected/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-kube-api-access-jssdj\") pod \"redhat-marketplace-4xklq\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.526202 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:26:52 crc kubenswrapper[4790]: I0313 21:26:52.091487 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xklq"] Mar 13 21:26:52 crc kubenswrapper[4790]: I0313 21:26:52.199654 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xklq" event={"ID":"a2e7f774-1873-40b8-a8d8-bf1e8677b87b","Type":"ContainerStarted","Data":"fd1ca7f5767685d5f0304ca00ae3b1facfd52b9570d737e7f9d5888cc79d70ba"} Mar 13 21:26:53 crc kubenswrapper[4790]: I0313 21:26:53.210174 4790 generic.go:334] "Generic (PLEG): container finished" podID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" containerID="f363ecae272085b309f07f3940ee5ef33158a7523e05c6c2f1bbf9cff7d9e412" exitCode=0 Mar 13 21:26:53 crc kubenswrapper[4790]: I0313 21:26:53.210261 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xklq" event={"ID":"a2e7f774-1873-40b8-a8d8-bf1e8677b87b","Type":"ContainerDied","Data":"f363ecae272085b309f07f3940ee5ef33158a7523e05c6c2f1bbf9cff7d9e412"} Mar 13 21:26:54 crc kubenswrapper[4790]: I0313 21:26:54.224725 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xklq" event={"ID":"a2e7f774-1873-40b8-a8d8-bf1e8677b87b","Type":"ContainerStarted","Data":"16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c"} Mar 13 21:26:55 crc kubenswrapper[4790]: I0313 21:26:55.233596 4790 generic.go:334] "Generic (PLEG): container finished" podID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" containerID="16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c" exitCode=0 Mar 13 21:26:55 crc kubenswrapper[4790]: I0313 21:26:55.233827 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xklq" 
event={"ID":"a2e7f774-1873-40b8-a8d8-bf1e8677b87b","Type":"ContainerDied","Data":"16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c"} Mar 13 21:26:56 crc kubenswrapper[4790]: I0313 21:26:56.263200 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xklq" event={"ID":"a2e7f774-1873-40b8-a8d8-bf1e8677b87b","Type":"ContainerStarted","Data":"a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b"} Mar 13 21:26:56 crc kubenswrapper[4790]: I0313 21:26:56.292497 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4xklq" podStartSLOduration=2.740686396 podStartE2EDuration="5.292473273s" podCreationTimestamp="2026-03-13 21:26:51 +0000 UTC" firstStartedPulling="2026-03-13 21:26:53.212031572 +0000 UTC m=+3544.233147463" lastFinishedPulling="2026-03-13 21:26:55.763818449 +0000 UTC m=+3546.784934340" observedRunningTime="2026-03-13 21:26:56.286931512 +0000 UTC m=+3547.308047413" watchObservedRunningTime="2026-03-13 21:26:56.292473273 +0000 UTC m=+3547.313589164" Mar 13 21:27:01 crc kubenswrapper[4790]: I0313 21:27:01.526355 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:27:01 crc kubenswrapper[4790]: I0313 21:27:01.526893 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:27:01 crc kubenswrapper[4790]: I0313 21:27:01.580702 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:27:01 crc kubenswrapper[4790]: I0313 21:27:01.947794 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-qr47g_b8f95d7e-96c6-475c-8bef-d72937cc36b4/control-plane-machine-set-operator/0.log" Mar 13 21:27:02 crc kubenswrapper[4790]: I0313 
21:27:02.158552 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jfdgz_a626166a-5d74-4dd9-b838-746731bfedef/machine-api-operator/0.log" Mar 13 21:27:02 crc kubenswrapper[4790]: I0313 21:27:02.177236 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jfdgz_a626166a-5d74-4dd9-b838-746731bfedef/kube-rbac-proxy/0.log" Mar 13 21:27:02 crc kubenswrapper[4790]: I0313 21:27:02.367443 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:27:03 crc kubenswrapper[4790]: I0313 21:27:03.369583 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xklq"] Mar 13 21:27:04 crc kubenswrapper[4790]: I0313 21:27:04.321871 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4xklq" podUID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" containerName="registry-server" containerID="cri-o://a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b" gracePeriod=2 Mar 13 21:27:04 crc kubenswrapper[4790]: I0313 21:27:04.789046 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:27:04 crc kubenswrapper[4790]: I0313 21:27:04.939489 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-utilities\") pod \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " Mar 13 21:27:04 crc kubenswrapper[4790]: I0313 21:27:04.939766 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-catalog-content\") pod \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " Mar 13 21:27:04 crc kubenswrapper[4790]: I0313 21:27:04.939862 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jssdj\" (UniqueName: \"kubernetes.io/projected/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-kube-api-access-jssdj\") pod \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " Mar 13 21:27:04 crc kubenswrapper[4790]: I0313 21:27:04.941108 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-utilities" (OuterVolumeSpecName: "utilities") pod "a2e7f774-1873-40b8-a8d8-bf1e8677b87b" (UID: "a2e7f774-1873-40b8-a8d8-bf1e8677b87b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:27:04 crc kubenswrapper[4790]: I0313 21:27:04.941738 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:27:04 crc kubenswrapper[4790]: I0313 21:27:04.955289 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-kube-api-access-jssdj" (OuterVolumeSpecName: "kube-api-access-jssdj") pod "a2e7f774-1873-40b8-a8d8-bf1e8677b87b" (UID: "a2e7f774-1873-40b8-a8d8-bf1e8677b87b"). InnerVolumeSpecName "kube-api-access-jssdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:27:04 crc kubenswrapper[4790]: I0313 21:27:04.985149 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2e7f774-1873-40b8-a8d8-bf1e8677b87b" (UID: "a2e7f774-1873-40b8-a8d8-bf1e8677b87b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.043334 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jssdj\" (UniqueName: \"kubernetes.io/projected/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-kube-api-access-jssdj\") on node \"crc\" DevicePath \"\"" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.043365 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.346406 4790 generic.go:334] "Generic (PLEG): container finished" podID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" containerID="a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b" exitCode=0 Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.346448 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xklq" event={"ID":"a2e7f774-1873-40b8-a8d8-bf1e8677b87b","Type":"ContainerDied","Data":"a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b"} Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.346478 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xklq" event={"ID":"a2e7f774-1873-40b8-a8d8-bf1e8677b87b","Type":"ContainerDied","Data":"fd1ca7f5767685d5f0304ca00ae3b1facfd52b9570d737e7f9d5888cc79d70ba"} Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.346500 4790 scope.go:117] "RemoveContainer" containerID="a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.346643 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.375289 4790 scope.go:117] "RemoveContainer" containerID="16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.395710 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xklq"] Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.402012 4790 scope.go:117] "RemoveContainer" containerID="f363ecae272085b309f07f3940ee5ef33158a7523e05c6c2f1bbf9cff7d9e412" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.414936 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xklq"] Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.442680 4790 scope.go:117] "RemoveContainer" containerID="a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b" Mar 13 21:27:05 crc kubenswrapper[4790]: E0313 21:27:05.443144 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b\": container with ID starting with a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b not found: ID does not exist" containerID="a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.443186 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b"} err="failed to get container status \"a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b\": rpc error: code = NotFound desc = could not find container \"a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b\": container with ID starting with a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b not found: 
ID does not exist" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.443210 4790 scope.go:117] "RemoveContainer" containerID="16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c" Mar 13 21:27:05 crc kubenswrapper[4790]: E0313 21:27:05.443498 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c\": container with ID starting with 16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c not found: ID does not exist" containerID="16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.443543 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c"} err="failed to get container status \"16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c\": rpc error: code = NotFound desc = could not find container \"16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c\": container with ID starting with 16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c not found: ID does not exist" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.443570 4790 scope.go:117] "RemoveContainer" containerID="f363ecae272085b309f07f3940ee5ef33158a7523e05c6c2f1bbf9cff7d9e412" Mar 13 21:27:05 crc kubenswrapper[4790]: E0313 21:27:05.443842 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f363ecae272085b309f07f3940ee5ef33158a7523e05c6c2f1bbf9cff7d9e412\": container with ID starting with f363ecae272085b309f07f3940ee5ef33158a7523e05c6c2f1bbf9cff7d9e412 not found: ID does not exist" containerID="f363ecae272085b309f07f3940ee5ef33158a7523e05c6c2f1bbf9cff7d9e412" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.443873 4790 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f363ecae272085b309f07f3940ee5ef33158a7523e05c6c2f1bbf9cff7d9e412"} err="failed to get container status \"f363ecae272085b309f07f3940ee5ef33158a7523e05c6c2f1bbf9cff7d9e412\": rpc error: code = NotFound desc = could not find container \"f363ecae272085b309f07f3940ee5ef33158a7523e05c6c2f1bbf9cff7d9e412\": container with ID starting with f363ecae272085b309f07f3940ee5ef33158a7523e05c6c2f1bbf9cff7d9e412 not found: ID does not exist" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.672024 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" path="/var/lib/kubelet/pods/a2e7f774-1873-40b8-a8d8-bf1e8677b87b/volumes" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.776169 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mlctx"] Mar 13 21:27:07 crc kubenswrapper[4790]: E0313 21:27:07.776986 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" containerName="registry-server" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.776997 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" containerName="registry-server" Mar 13 21:27:07 crc kubenswrapper[4790]: E0313 21:27:07.777017 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" containerName="extract-content" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.777023 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" containerName="extract-content" Mar 13 21:27:07 crc kubenswrapper[4790]: E0313 21:27:07.777039 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" containerName="extract-utilities" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.777045 4790 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" containerName="extract-utilities" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.777260 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" containerName="registry-server" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.778665 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.794590 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mlctx"] Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.796040 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-utilities\") pod \"certified-operators-mlctx\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.796107 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gftjn\" (UniqueName: \"kubernetes.io/projected/986a26ad-d48a-428b-99bb-7684ea902e87-kube-api-access-gftjn\") pod \"certified-operators-mlctx\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.796133 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-catalog-content\") pod \"certified-operators-mlctx\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 
21:27:07.897191 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-utilities\") pod \"certified-operators-mlctx\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.897282 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gftjn\" (UniqueName: \"kubernetes.io/projected/986a26ad-d48a-428b-99bb-7684ea902e87-kube-api-access-gftjn\") pod \"certified-operators-mlctx\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.897318 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-catalog-content\") pod \"certified-operators-mlctx\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.897643 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-utilities\") pod \"certified-operators-mlctx\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.897952 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-catalog-content\") pod \"certified-operators-mlctx\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.936743 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gftjn\" (UniqueName: \"kubernetes.io/projected/986a26ad-d48a-428b-99bb-7684ea902e87-kube-api-access-gftjn\") pod \"certified-operators-mlctx\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:08 crc kubenswrapper[4790]: I0313 21:27:08.096567 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:08 crc kubenswrapper[4790]: I0313 21:27:08.682753 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mlctx"] Mar 13 21:27:09 crc kubenswrapper[4790]: I0313 21:27:09.383029 4790 generic.go:334] "Generic (PLEG): container finished" podID="986a26ad-d48a-428b-99bb-7684ea902e87" containerID="8b0ee3586781cc82cc74300b4b12542f4c5fae45fe46865aef2a309d7c1372d6" exitCode=0 Mar 13 21:27:09 crc kubenswrapper[4790]: I0313 21:27:09.383142 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlctx" event={"ID":"986a26ad-d48a-428b-99bb-7684ea902e87","Type":"ContainerDied","Data":"8b0ee3586781cc82cc74300b4b12542f4c5fae45fe46865aef2a309d7c1372d6"} Mar 13 21:27:09 crc kubenswrapper[4790]: I0313 21:27:09.383358 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlctx" event={"ID":"986a26ad-d48a-428b-99bb-7684ea902e87","Type":"ContainerStarted","Data":"082cf7f16d850b1cc1a9caea3f6f0a73b6f9737403c81ae02f070b30089db739"} Mar 13 21:27:10 crc kubenswrapper[4790]: I0313 21:27:10.396938 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlctx" event={"ID":"986a26ad-d48a-428b-99bb-7684ea902e87","Type":"ContainerStarted","Data":"3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644"} Mar 13 21:27:11 crc kubenswrapper[4790]: I0313 21:27:11.407281 4790 
generic.go:334] "Generic (PLEG): container finished" podID="986a26ad-d48a-428b-99bb-7684ea902e87" containerID="3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644" exitCode=0 Mar 13 21:27:11 crc kubenswrapper[4790]: I0313 21:27:11.407329 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlctx" event={"ID":"986a26ad-d48a-428b-99bb-7684ea902e87","Type":"ContainerDied","Data":"3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644"} Mar 13 21:27:12 crc kubenswrapper[4790]: I0313 21:27:12.417217 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlctx" event={"ID":"986a26ad-d48a-428b-99bb-7684ea902e87","Type":"ContainerStarted","Data":"428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035"} Mar 13 21:27:12 crc kubenswrapper[4790]: I0313 21:27:12.437167 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mlctx" podStartSLOduration=3.040953993 podStartE2EDuration="5.437148686s" podCreationTimestamp="2026-03-13 21:27:07 +0000 UTC" firstStartedPulling="2026-03-13 21:27:09.385359809 +0000 UTC m=+3560.406475700" lastFinishedPulling="2026-03-13 21:27:11.781554502 +0000 UTC m=+3562.802670393" observedRunningTime="2026-03-13 21:27:12.433712892 +0000 UTC m=+3563.454828773" watchObservedRunningTime="2026-03-13 21:27:12.437148686 +0000 UTC m=+3563.458264577" Mar 13 21:27:16 crc kubenswrapper[4790]: I0313 21:27:16.077799 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-fgq7z_c77372fb-0649-4c32-be4f-34c3dd515246/cert-manager-controller/0.log" Mar 13 21:27:16 crc kubenswrapper[4790]: I0313 21:27:16.236319 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-vfjwg_f58ec868-a42c-463c-b65f-bf118fae6518/cert-manager-cainjector/0.log" Mar 13 21:27:16 crc kubenswrapper[4790]: 
I0313 21:27:16.308470 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-p4h8t_1430c143-e235-49e5-a141-78b9e3297b70/cert-manager-webhook/0.log" Mar 13 21:27:18 crc kubenswrapper[4790]: I0313 21:27:18.096997 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:18 crc kubenswrapper[4790]: I0313 21:27:18.097308 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:18 crc kubenswrapper[4790]: I0313 21:27:18.141905 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:18 crc kubenswrapper[4790]: I0313 21:27:18.512500 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:18 crc kubenswrapper[4790]: I0313 21:27:18.578965 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mlctx"] Mar 13 21:27:20 crc kubenswrapper[4790]: I0313 21:27:20.482998 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mlctx" podUID="986a26ad-d48a-428b-99bb-7684ea902e87" containerName="registry-server" containerID="cri-o://428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035" gracePeriod=2 Mar 13 21:27:20 crc kubenswrapper[4790]: I0313 21:27:20.933597 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:20 crc kubenswrapper[4790]: I0313 21:27:20.967720 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-catalog-content\") pod \"986a26ad-d48a-428b-99bb-7684ea902e87\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " Mar 13 21:27:20 crc kubenswrapper[4790]: I0313 21:27:20.967988 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gftjn\" (UniqueName: \"kubernetes.io/projected/986a26ad-d48a-428b-99bb-7684ea902e87-kube-api-access-gftjn\") pod \"986a26ad-d48a-428b-99bb-7684ea902e87\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " Mar 13 21:27:20 crc kubenswrapper[4790]: I0313 21:27:20.968017 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-utilities\") pod \"986a26ad-d48a-428b-99bb-7684ea902e87\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " Mar 13 21:27:20 crc kubenswrapper[4790]: I0313 21:27:20.968891 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-utilities" (OuterVolumeSpecName: "utilities") pod "986a26ad-d48a-428b-99bb-7684ea902e87" (UID: "986a26ad-d48a-428b-99bb-7684ea902e87"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:27:20 crc kubenswrapper[4790]: I0313 21:27:20.974055 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/986a26ad-d48a-428b-99bb-7684ea902e87-kube-api-access-gftjn" (OuterVolumeSpecName: "kube-api-access-gftjn") pod "986a26ad-d48a-428b-99bb-7684ea902e87" (UID: "986a26ad-d48a-428b-99bb-7684ea902e87"). InnerVolumeSpecName "kube-api-access-gftjn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.036996 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "986a26ad-d48a-428b-99bb-7684ea902e87" (UID: "986a26ad-d48a-428b-99bb-7684ea902e87"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.070123 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gftjn\" (UniqueName: \"kubernetes.io/projected/986a26ad-d48a-428b-99bb-7684ea902e87-kube-api-access-gftjn\") on node \"crc\" DevicePath \"\"" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.070408 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.070519 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.496317 4790 generic.go:334] "Generic (PLEG): container finished" podID="986a26ad-d48a-428b-99bb-7684ea902e87" containerID="428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035" exitCode=0 Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.496400 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.496411 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlctx" event={"ID":"986a26ad-d48a-428b-99bb-7684ea902e87","Type":"ContainerDied","Data":"428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035"} Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.496729 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlctx" event={"ID":"986a26ad-d48a-428b-99bb-7684ea902e87","Type":"ContainerDied","Data":"082cf7f16d850b1cc1a9caea3f6f0a73b6f9737403c81ae02f070b30089db739"} Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.496749 4790 scope.go:117] "RemoveContainer" containerID="428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.534840 4790 scope.go:117] "RemoveContainer" containerID="3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.538906 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mlctx"] Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.550246 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mlctx"] Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.577477 4790 scope.go:117] "RemoveContainer" containerID="8b0ee3586781cc82cc74300b4b12542f4c5fae45fe46865aef2a309d7c1372d6" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.598052 4790 scope.go:117] "RemoveContainer" containerID="428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035" Mar 13 21:27:21 crc kubenswrapper[4790]: E0313 21:27:21.598385 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035\": container with ID starting with 428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035 not found: ID does not exist" containerID="428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.598430 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035"} err="failed to get container status \"428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035\": rpc error: code = NotFound desc = could not find container \"428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035\": container with ID starting with 428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035 not found: ID does not exist" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.598454 4790 scope.go:117] "RemoveContainer" containerID="3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644" Mar 13 21:27:21 crc kubenswrapper[4790]: E0313 21:27:21.598858 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644\": container with ID starting with 3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644 not found: ID does not exist" containerID="3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.598887 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644"} err="failed to get container status \"3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644\": rpc error: code = NotFound desc = could not find container \"3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644\": container with ID 
starting with 3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644 not found: ID does not exist" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.598907 4790 scope.go:117] "RemoveContainer" containerID="8b0ee3586781cc82cc74300b4b12542f4c5fae45fe46865aef2a309d7c1372d6" Mar 13 21:27:21 crc kubenswrapper[4790]: E0313 21:27:21.599162 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b0ee3586781cc82cc74300b4b12542f4c5fae45fe46865aef2a309d7c1372d6\": container with ID starting with 8b0ee3586781cc82cc74300b4b12542f4c5fae45fe46865aef2a309d7c1372d6 not found: ID does not exist" containerID="8b0ee3586781cc82cc74300b4b12542f4c5fae45fe46865aef2a309d7c1372d6" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.599186 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b0ee3586781cc82cc74300b4b12542f4c5fae45fe46865aef2a309d7c1372d6"} err="failed to get container status \"8b0ee3586781cc82cc74300b4b12542f4c5fae45fe46865aef2a309d7c1372d6\": rpc error: code = NotFound desc = could not find container \"8b0ee3586781cc82cc74300b4b12542f4c5fae45fe46865aef2a309d7c1372d6\": container with ID starting with 8b0ee3586781cc82cc74300b4b12542f4c5fae45fe46865aef2a309d7c1372d6 not found: ID does not exist" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.669868 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="986a26ad-d48a-428b-99bb-7684ea902e87" path="/var/lib/kubelet/pods/986a26ad-d48a-428b-99bb-7684ea902e87/volumes" Mar 13 21:27:29 crc kubenswrapper[4790]: I0313 21:27:29.425519 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-k8mcs_c7ef6baa-3c87-44a8-91d2-bcfbc0696396/nmstate-console-plugin/0.log" Mar 13 21:27:29 crc kubenswrapper[4790]: I0313 21:27:29.572872 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-b2697_d5c9a572-635b-4ecc-a2a4-c7e459d6d510/nmstate-handler/0.log" Mar 13 21:27:29 crc kubenswrapper[4790]: I0313 21:27:29.631727 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-wvv95_4295503b-996b-4a20-844b-07a90de225a6/kube-rbac-proxy/0.log" Mar 13 21:27:29 crc kubenswrapper[4790]: I0313 21:27:29.720408 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-wvv95_4295503b-996b-4a20-844b-07a90de225a6/nmstate-metrics/0.log" Mar 13 21:27:29 crc kubenswrapper[4790]: I0313 21:27:29.832012 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-4lvtv_4d5f9755-21a7-482e-8788-85ed86738b40/nmstate-operator/0.log" Mar 13 21:27:29 crc kubenswrapper[4790]: I0313 21:27:29.910744 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-qld4w_e1a3b709-858c-4bca-b52b-c96dc23d9149/nmstate-webhook/0.log" Mar 13 21:27:55 crc kubenswrapper[4790]: I0313 21:27:55.794747 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-czl9k_d5ef8654-e56f-454b-9fae-0753a30dab0f/kube-rbac-proxy/0.log" Mar 13 21:27:55 crc kubenswrapper[4790]: I0313 21:27:55.867054 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-czl9k_d5ef8654-e56f-454b-9fae-0753a30dab0f/controller/0.log" Mar 13 21:27:55 crc kubenswrapper[4790]: I0313 21:27:55.981033 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-frr-files/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.197583 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-frr-files/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 
21:27:56.229873 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-reloader/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.245493 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-metrics/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.252252 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-reloader/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.461213 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-reloader/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.487819 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-metrics/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.498684 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-frr-files/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.532949 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-metrics/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.644273 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-frr-files/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.653958 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-reloader/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.678993 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-metrics/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.729579 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/controller/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.827322 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/frr-metrics/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.865349 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/kube-rbac-proxy/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.949055 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/kube-rbac-proxy-frr/0.log" Mar 13 21:27:57 crc kubenswrapper[4790]: I0313 21:27:57.035617 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/reloader/0.log" Mar 13 21:27:57 crc kubenswrapper[4790]: I0313 21:27:57.173223 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-8ckr8_472cc73a-53fe-4d7c-aec8-b2154023ba90/frr-k8s-webhook-server/0.log" Mar 13 21:27:57 crc kubenswrapper[4790]: I0313 21:27:57.322932 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6c885c8d8c-fcv54_da23093d-500f-43f4-805a-b4a252e40940/manager/0.log" Mar 13 21:27:57 crc kubenswrapper[4790]: I0313 21:27:57.450546 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-76c9b767d4-c6mq2_783be831-b522-42a0-9cbe-f234ed3a027c/webhook-server/0.log" Mar 13 21:27:57 crc kubenswrapper[4790]: I0313 21:27:57.681751 4790 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5tk2m_a3729738-ead5-47e0-95de-04dc39fb0516/kube-rbac-proxy/0.log" Mar 13 21:27:58 crc kubenswrapper[4790]: I0313 21:27:58.293411 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5tk2m_a3729738-ead5-47e0-95de-04dc39fb0516/speaker/0.log" Mar 13 21:27:58 crc kubenswrapper[4790]: I0313 21:27:58.516228 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/frr/0.log" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.143726 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557288-qmdk7"] Mar 13 21:28:00 crc kubenswrapper[4790]: E0313 21:28:00.144473 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="986a26ad-d48a-428b-99bb-7684ea902e87" containerName="extract-content" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.144490 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="986a26ad-d48a-428b-99bb-7684ea902e87" containerName="extract-content" Mar 13 21:28:00 crc kubenswrapper[4790]: E0313 21:28:00.144518 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="986a26ad-d48a-428b-99bb-7684ea902e87" containerName="extract-utilities" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.144525 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="986a26ad-d48a-428b-99bb-7684ea902e87" containerName="extract-utilities" Mar 13 21:28:00 crc kubenswrapper[4790]: E0313 21:28:00.144549 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="986a26ad-d48a-428b-99bb-7684ea902e87" containerName="registry-server" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.144557 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="986a26ad-d48a-428b-99bb-7684ea902e87" containerName="registry-server" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.144805 4790 
memory_manager.go:354] "RemoveStaleState removing state" podUID="986a26ad-d48a-428b-99bb-7684ea902e87" containerName="registry-server" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.145706 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557288-qmdk7" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.147838 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.150765 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.150992 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.153202 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557288-qmdk7"] Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.281023 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pqf6\" (UniqueName: \"kubernetes.io/projected/4d3de3f1-0534-4203-b465-d512d6c80287-kube-api-access-9pqf6\") pod \"auto-csr-approver-29557288-qmdk7\" (UID: \"4d3de3f1-0534-4203-b465-d512d6c80287\") " pod="openshift-infra/auto-csr-approver-29557288-qmdk7" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.382895 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pqf6\" (UniqueName: \"kubernetes.io/projected/4d3de3f1-0534-4203-b465-d512d6c80287-kube-api-access-9pqf6\") pod \"auto-csr-approver-29557288-qmdk7\" (UID: \"4d3de3f1-0534-4203-b465-d512d6c80287\") " pod="openshift-infra/auto-csr-approver-29557288-qmdk7" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.428662 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9pqf6\" (UniqueName: \"kubernetes.io/projected/4d3de3f1-0534-4203-b465-d512d6c80287-kube-api-access-9pqf6\") pod \"auto-csr-approver-29557288-qmdk7\" (UID: \"4d3de3f1-0534-4203-b465-d512d6c80287\") " pod="openshift-infra/auto-csr-approver-29557288-qmdk7" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.464992 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557288-qmdk7" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.914500 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557288-qmdk7"] Mar 13 21:28:01 crc kubenswrapper[4790]: I0313 21:28:01.863224 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557288-qmdk7" event={"ID":"4d3de3f1-0534-4203-b465-d512d6c80287","Type":"ContainerStarted","Data":"083d7cbcd12c287e727aed7a01b7ce6bd4531ca855019f4fdabe4a1671c16c15"} Mar 13 21:28:02 crc kubenswrapper[4790]: I0313 21:28:02.874160 4790 generic.go:334] "Generic (PLEG): container finished" podID="4d3de3f1-0534-4203-b465-d512d6c80287" containerID="c265de87623abb9a96ed933e22a3276547bc13888411d097434621497cc49ed1" exitCode=0 Mar 13 21:28:02 crc kubenswrapper[4790]: I0313 21:28:02.874213 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557288-qmdk7" event={"ID":"4d3de3f1-0534-4203-b465-d512d6c80287","Type":"ContainerDied","Data":"c265de87623abb9a96ed933e22a3276547bc13888411d097434621497cc49ed1"} Mar 13 21:28:04 crc kubenswrapper[4790]: I0313 21:28:04.191303 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557288-qmdk7" Mar 13 21:28:04 crc kubenswrapper[4790]: I0313 21:28:04.358115 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pqf6\" (UniqueName: \"kubernetes.io/projected/4d3de3f1-0534-4203-b465-d512d6c80287-kube-api-access-9pqf6\") pod \"4d3de3f1-0534-4203-b465-d512d6c80287\" (UID: \"4d3de3f1-0534-4203-b465-d512d6c80287\") " Mar 13 21:28:04 crc kubenswrapper[4790]: I0313 21:28:04.366054 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d3de3f1-0534-4203-b465-d512d6c80287-kube-api-access-9pqf6" (OuterVolumeSpecName: "kube-api-access-9pqf6") pod "4d3de3f1-0534-4203-b465-d512d6c80287" (UID: "4d3de3f1-0534-4203-b465-d512d6c80287"). InnerVolumeSpecName "kube-api-access-9pqf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:28:04 crc kubenswrapper[4790]: I0313 21:28:04.460485 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pqf6\" (UniqueName: \"kubernetes.io/projected/4d3de3f1-0534-4203-b465-d512d6c80287-kube-api-access-9pqf6\") on node \"crc\" DevicePath \"\"" Mar 13 21:28:04 crc kubenswrapper[4790]: I0313 21:28:04.900041 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557288-qmdk7" event={"ID":"4d3de3f1-0534-4203-b465-d512d6c80287","Type":"ContainerDied","Data":"083d7cbcd12c287e727aed7a01b7ce6bd4531ca855019f4fdabe4a1671c16c15"} Mar 13 21:28:04 crc kubenswrapper[4790]: I0313 21:28:04.900082 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="083d7cbcd12c287e727aed7a01b7ce6bd4531ca855019f4fdabe4a1671c16c15" Mar 13 21:28:04 crc kubenswrapper[4790]: I0313 21:28:04.900140 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557288-qmdk7" Mar 13 21:28:05 crc kubenswrapper[4790]: I0313 21:28:05.264673 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557282-h6blc"] Mar 13 21:28:05 crc kubenswrapper[4790]: I0313 21:28:05.277103 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557282-h6blc"] Mar 13 21:28:05 crc kubenswrapper[4790]: I0313 21:28:05.669519 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4a5c228-86a3-4945-8d95-44db739406d7" path="/var/lib/kubelet/pods/e4a5c228-86a3-4945-8d95-44db739406d7/volumes" Mar 13 21:28:10 crc kubenswrapper[4790]: I0313 21:28:10.406864 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v_16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854/util/0.log" Mar 13 21:28:10 crc kubenswrapper[4790]: I0313 21:28:10.625446 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v_16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854/pull/0.log" Mar 13 21:28:10 crc kubenswrapper[4790]: I0313 21:28:10.636763 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v_16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854/pull/0.log" Mar 13 21:28:10 crc kubenswrapper[4790]: I0313 21:28:10.655025 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v_16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854/util/0.log" Mar 13 21:28:10 crc kubenswrapper[4790]: I0313 21:28:10.789002 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v_16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854/util/0.log" Mar 13 
21:28:10 crc kubenswrapper[4790]: I0313 21:28:10.811742 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v_16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854/pull/0.log" Mar 13 21:28:10 crc kubenswrapper[4790]: I0313 21:28:10.818448 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v_16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854/extract/0.log" Mar 13 21:28:10 crc kubenswrapper[4790]: I0313 21:28:10.981282 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px_6940903a-9dc5-4001-bc87-9de2bdce9e52/util/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.139106 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px_6940903a-9dc5-4001-bc87-9de2bdce9e52/util/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.142614 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px_6940903a-9dc5-4001-bc87-9de2bdce9e52/pull/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.165828 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px_6940903a-9dc5-4001-bc87-9de2bdce9e52/pull/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.302218 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px_6940903a-9dc5-4001-bc87-9de2bdce9e52/util/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.313006 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px_6940903a-9dc5-4001-bc87-9de2bdce9e52/extract/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.326191 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px_6940903a-9dc5-4001-bc87-9de2bdce9e52/pull/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.467352 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q5brt_9e374399-85bd-4121-9352-23a37bdf41f3/extract-utilities/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.636988 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q5brt_9e374399-85bd-4121-9352-23a37bdf41f3/extract-content/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.639003 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q5brt_9e374399-85bd-4121-9352-23a37bdf41f3/extract-utilities/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.652188 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q5brt_9e374399-85bd-4121-9352-23a37bdf41f3/extract-content/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.858859 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q5brt_9e374399-85bd-4121-9352-23a37bdf41f3/extract-utilities/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.898336 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q5brt_9e374399-85bd-4121-9352-23a37bdf41f3/extract-content/0.log" Mar 13 21:28:12 crc kubenswrapper[4790]: I0313 21:28:12.117580 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-qcdqx_2ab722e2-16ac-40ba-9c44-903bf6bb8db8/extract-utilities/0.log" Mar 13 21:28:12 crc kubenswrapper[4790]: I0313 21:28:12.354989 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qcdqx_2ab722e2-16ac-40ba-9c44-903bf6bb8db8/extract-utilities/0.log" Mar 13 21:28:12 crc kubenswrapper[4790]: I0313 21:28:12.381109 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qcdqx_2ab722e2-16ac-40ba-9c44-903bf6bb8db8/extract-content/0.log" Mar 13 21:28:12 crc kubenswrapper[4790]: I0313 21:28:12.424509 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qcdqx_2ab722e2-16ac-40ba-9c44-903bf6bb8db8/extract-content/0.log" Mar 13 21:28:12 crc kubenswrapper[4790]: I0313 21:28:12.454503 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q5brt_9e374399-85bd-4121-9352-23a37bdf41f3/registry-server/0.log" Mar 13 21:28:12 crc kubenswrapper[4790]: I0313 21:28:12.592054 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qcdqx_2ab722e2-16ac-40ba-9c44-903bf6bb8db8/extract-utilities/0.log" Mar 13 21:28:12 crc kubenswrapper[4790]: I0313 21:28:12.616545 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qcdqx_2ab722e2-16ac-40ba-9c44-903bf6bb8db8/extract-content/0.log" Mar 13 21:28:12 crc kubenswrapper[4790]: I0313 21:28:12.822610 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-n548b_97fe66e8-7366-4c61-b1db-4d98459834da/marketplace-operator/0.log" Mar 13 21:28:12 crc kubenswrapper[4790]: I0313 21:28:12.925933 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-zhllj_0af82375-cffb-4861-82d2-5f1a0e4a8496/extract-utilities/0.log" Mar 13 21:28:13 crc kubenswrapper[4790]: I0313 21:28:13.010057 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qcdqx_2ab722e2-16ac-40ba-9c44-903bf6bb8db8/registry-server/0.log" Mar 13 21:28:13 crc kubenswrapper[4790]: I0313 21:28:13.099328 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zhllj_0af82375-cffb-4861-82d2-5f1a0e4a8496/extract-utilities/0.log" Mar 13 21:28:13 crc kubenswrapper[4790]: I0313 21:28:13.148876 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zhllj_0af82375-cffb-4861-82d2-5f1a0e4a8496/extract-content/0.log" Mar 13 21:28:13 crc kubenswrapper[4790]: I0313 21:28:13.174575 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zhllj_0af82375-cffb-4861-82d2-5f1a0e4a8496/extract-content/0.log" Mar 13 21:28:13 crc kubenswrapper[4790]: I0313 21:28:13.316130 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zhllj_0af82375-cffb-4861-82d2-5f1a0e4a8496/extract-utilities/0.log" Mar 13 21:28:13 crc kubenswrapper[4790]: I0313 21:28:13.359795 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zhllj_0af82375-cffb-4861-82d2-5f1a0e4a8496/extract-content/0.log" Mar 13 21:28:13 crc kubenswrapper[4790]: I0313 21:28:13.534901 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zhllj_0af82375-cffb-4861-82d2-5f1a0e4a8496/registry-server/0.log" Mar 13 21:28:13 crc kubenswrapper[4790]: I0313 21:28:13.538915 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-b7b2s_5ad984b4-e6a7-4559-99e4-02a03eda6303/extract-utilities/0.log" Mar 13 21:28:13 crc kubenswrapper[4790]: I0313 21:28:13.776780 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-b7b2s_5ad984b4-e6a7-4559-99e4-02a03eda6303/extract-utilities/0.log" Mar 13 21:28:13 crc kubenswrapper[4790]: I0313 21:28:13.798544 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-b7b2s_5ad984b4-e6a7-4559-99e4-02a03eda6303/extract-content/0.log" Mar 13 21:28:13 crc kubenswrapper[4790]: I0313 21:28:13.841222 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-b7b2s_5ad984b4-e6a7-4559-99e4-02a03eda6303/extract-content/0.log" Mar 13 21:28:14 crc kubenswrapper[4790]: I0313 21:28:14.008512 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-b7b2s_5ad984b4-e6a7-4559-99e4-02a03eda6303/extract-utilities/0.log" Mar 13 21:28:14 crc kubenswrapper[4790]: I0313 21:28:14.015303 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:28:14 crc kubenswrapper[4790]: I0313 21:28:14.015523 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:28:14 crc kubenswrapper[4790]: I0313 21:28:14.079873 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-b7b2s_5ad984b4-e6a7-4559-99e4-02a03eda6303/extract-content/0.log" Mar 13 21:28:14 crc kubenswrapper[4790]: I0313 21:28:14.634706 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-b7b2s_5ad984b4-e6a7-4559-99e4-02a03eda6303/registry-server/0.log" Mar 13 21:28:19 crc kubenswrapper[4790]: I0313 21:28:19.754903 4790 scope.go:117] "RemoveContainer" containerID="5019beb318c0070d1f51637c47bb15945a64aa1c344d598234b2e66e74401ef0" Mar 13 21:28:44 crc kubenswrapper[4790]: I0313 21:28:44.015988 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:28:44 crc kubenswrapper[4790]: I0313 21:28:44.016581 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:29:14 crc kubenswrapper[4790]: I0313 21:29:14.015772 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:29:14 crc kubenswrapper[4790]: I0313 21:29:14.016424 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 13 21:29:14 crc kubenswrapper[4790]: I0313 21:29:14.016490 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 21:29:14 crc kubenswrapper[4790]: I0313 21:29:14.017543 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 21:29:14 crc kubenswrapper[4790]: I0313 21:29:14.017629 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" gracePeriod=600 Mar 13 21:29:14 crc kubenswrapper[4790]: E0313 21:29:14.136536 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:29:14 crc kubenswrapper[4790]: I0313 21:29:14.476119 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" exitCode=0 Mar 13 21:29:14 crc kubenswrapper[4790]: I0313 21:29:14.476193 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80"} Mar 13 21:29:14 crc kubenswrapper[4790]: I0313 21:29:14.476258 4790 scope.go:117] "RemoveContainer" containerID="5e764877937c3d83a4b1853363d471bb75b0ef968565309da1f28c291b8d45e7" Mar 13 21:29:14 crc kubenswrapper[4790]: I0313 21:29:14.476940 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:29:14 crc kubenswrapper[4790]: E0313 21:29:14.477451 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:29:28 crc kubenswrapper[4790]: I0313 21:29:28.661800 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:29:28 crc kubenswrapper[4790]: E0313 21:29:28.662825 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:29:40 crc kubenswrapper[4790]: I0313 21:29:40.660414 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:29:40 crc kubenswrapper[4790]: E0313 21:29:40.661296 4790 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:29:55 crc kubenswrapper[4790]: I0313 21:29:55.659925 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:29:55 crc kubenswrapper[4790]: E0313 21:29:55.662026 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.153751 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557290-2w2zk"] Mar 13 21:30:00 crc kubenswrapper[4790]: E0313 21:30:00.154838 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d3de3f1-0534-4203-b465-d512d6c80287" containerName="oc" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.154857 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d3de3f1-0534-4203-b465-d512d6c80287" containerName="oc" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.155149 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d3de3f1-0534-4203-b465-d512d6c80287" containerName="oc" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.156041 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557290-2w2zk" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.158706 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.158714 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.166407 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.166654 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d"] Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.167880 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.169670 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.175172 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.179848 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557290-2w2zk"] Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.193282 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d"] Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.255281 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/3891db6b-832a-4f78-9d91-2945136ac41d-config-volume\") pod \"collect-profiles-29557290-8rn8d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.255645 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzbs4\" (UniqueName: \"kubernetes.io/projected/c83c8e92-0c0b-4b43-8391-1c63b8755c64-kube-api-access-lzbs4\") pod \"auto-csr-approver-29557290-2w2zk\" (UID: \"c83c8e92-0c0b-4b43-8391-1c63b8755c64\") " pod="openshift-infra/auto-csr-approver-29557290-2w2zk" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.255809 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szz29\" (UniqueName: \"kubernetes.io/projected/3891db6b-832a-4f78-9d91-2945136ac41d-kube-api-access-szz29\") pod \"collect-profiles-29557290-8rn8d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.256163 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3891db6b-832a-4f78-9d91-2945136ac41d-secret-volume\") pod \"collect-profiles-29557290-8rn8d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.357857 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3891db6b-832a-4f78-9d91-2945136ac41d-config-volume\") pod \"collect-profiles-29557290-8rn8d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 
21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.357970 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzbs4\" (UniqueName: \"kubernetes.io/projected/c83c8e92-0c0b-4b43-8391-1c63b8755c64-kube-api-access-lzbs4\") pod \"auto-csr-approver-29557290-2w2zk\" (UID: \"c83c8e92-0c0b-4b43-8391-1c63b8755c64\") " pod="openshift-infra/auto-csr-approver-29557290-2w2zk" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.358037 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szz29\" (UniqueName: \"kubernetes.io/projected/3891db6b-832a-4f78-9d91-2945136ac41d-kube-api-access-szz29\") pod \"collect-profiles-29557290-8rn8d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.358145 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3891db6b-832a-4f78-9d91-2945136ac41d-secret-volume\") pod \"collect-profiles-29557290-8rn8d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.360749 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3891db6b-832a-4f78-9d91-2945136ac41d-config-volume\") pod \"collect-profiles-29557290-8rn8d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.364299 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3891db6b-832a-4f78-9d91-2945136ac41d-secret-volume\") pod \"collect-profiles-29557290-8rn8d\" (UID: 
\"3891db6b-832a-4f78-9d91-2945136ac41d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.377437 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzbs4\" (UniqueName: \"kubernetes.io/projected/c83c8e92-0c0b-4b43-8391-1c63b8755c64-kube-api-access-lzbs4\") pod \"auto-csr-approver-29557290-2w2zk\" (UID: \"c83c8e92-0c0b-4b43-8391-1c63b8755c64\") " pod="openshift-infra/auto-csr-approver-29557290-2w2zk" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.388405 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szz29\" (UniqueName: \"kubernetes.io/projected/3891db6b-832a-4f78-9d91-2945136ac41d-kube-api-access-szz29\") pod \"collect-profiles-29557290-8rn8d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.481933 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557290-2w2zk" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.488833 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.954915 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557290-2w2zk"] Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.958447 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 21:30:01 crc kubenswrapper[4790]: W0313 21:30:01.035865 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3891db6b_832a_4f78_9d91_2945136ac41d.slice/crio-7422975de7713d05618db81d72e5d50a01a667ab335640040ec85093c38b0a10 WatchSource:0}: Error finding container 7422975de7713d05618db81d72e5d50a01a667ab335640040ec85093c38b0a10: Status 404 returned error can't find the container with id 7422975de7713d05618db81d72e5d50a01a667ab335640040ec85093c38b0a10 Mar 13 21:30:01 crc kubenswrapper[4790]: I0313 21:30:01.052611 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d"] Mar 13 21:30:01 crc kubenswrapper[4790]: I0313 21:30:01.920092 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557290-2w2zk" event={"ID":"c83c8e92-0c0b-4b43-8391-1c63b8755c64","Type":"ContainerStarted","Data":"0a0cecd5c3e5fda8f03d676b699f878194e10eb22bb6b8ec985f0446cc2c1fa3"} Mar 13 21:30:01 crc kubenswrapper[4790]: I0313 21:30:01.922628 4790 generic.go:334] "Generic (PLEG): container finished" podID="3891db6b-832a-4f78-9d91-2945136ac41d" containerID="8166015613c291721772d0950e6a082638300ea94be5c443065d9e3891ab62a3" exitCode=0 Mar 13 21:30:01 crc kubenswrapper[4790]: I0313 21:30:01.922667 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" 
event={"ID":"3891db6b-832a-4f78-9d91-2945136ac41d","Type":"ContainerDied","Data":"8166015613c291721772d0950e6a082638300ea94be5c443065d9e3891ab62a3"} Mar 13 21:30:01 crc kubenswrapper[4790]: I0313 21:30:01.922687 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" event={"ID":"3891db6b-832a-4f78-9d91-2945136ac41d","Type":"ContainerStarted","Data":"7422975de7713d05618db81d72e5d50a01a667ab335640040ec85093c38b0a10"} Mar 13 21:30:02 crc kubenswrapper[4790]: I0313 21:30:02.933796 4790 generic.go:334] "Generic (PLEG): container finished" podID="09855131-fcae-4c41-83c2-2874fd6e7068" containerID="3a1d32bb413765095ebca93898109c48096d1087d7c63a4448ef4a85e11734c8" exitCode=0 Mar 13 21:30:02 crc kubenswrapper[4790]: I0313 21:30:02.933943 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6kfv/must-gather-qf7z2" event={"ID":"09855131-fcae-4c41-83c2-2874fd6e7068","Type":"ContainerDied","Data":"3a1d32bb413765095ebca93898109c48096d1087d7c63a4448ef4a85e11734c8"} Mar 13 21:30:02 crc kubenswrapper[4790]: I0313 21:30:02.934939 4790 scope.go:117] "RemoveContainer" containerID="3a1d32bb413765095ebca93898109c48096d1087d7c63a4448ef4a85e11734c8" Mar 13 21:30:02 crc kubenswrapper[4790]: I0313 21:30:02.935909 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557290-2w2zk" event={"ID":"c83c8e92-0c0b-4b43-8391-1c63b8755c64","Type":"ContainerStarted","Data":"8545a35dacc04bd1f83edfa9e8f634e87f549dc58a48933c26d35edf437fcb49"} Mar 13 21:30:02 crc kubenswrapper[4790]: I0313 21:30:02.973058 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557290-2w2zk" podStartSLOduration=1.398502391 podStartE2EDuration="2.973031851s" podCreationTimestamp="2026-03-13 21:30:00 +0000 UTC" firstStartedPulling="2026-03-13 21:30:00.958191893 +0000 UTC m=+3731.979307784" lastFinishedPulling="2026-03-13 
21:30:02.532721363 +0000 UTC m=+3733.553837244" observedRunningTime="2026-03-13 21:30:02.967170891 +0000 UTC m=+3733.988286782" watchObservedRunningTime="2026-03-13 21:30:02.973031851 +0000 UTC m=+3733.994147742" Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.263824 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.323326 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3891db6b-832a-4f78-9d91-2945136ac41d-config-volume\") pod \"3891db6b-832a-4f78-9d91-2945136ac41d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.323531 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3891db6b-832a-4f78-9d91-2945136ac41d-secret-volume\") pod \"3891db6b-832a-4f78-9d91-2945136ac41d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.323574 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szz29\" (UniqueName: \"kubernetes.io/projected/3891db6b-832a-4f78-9d91-2945136ac41d-kube-api-access-szz29\") pod \"3891db6b-832a-4f78-9d91-2945136ac41d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.324711 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3891db6b-832a-4f78-9d91-2945136ac41d-config-volume" (OuterVolumeSpecName: "config-volume") pod "3891db6b-832a-4f78-9d91-2945136ac41d" (UID: "3891db6b-832a-4f78-9d91-2945136ac41d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.331630 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3891db6b-832a-4f78-9d91-2945136ac41d-kube-api-access-szz29" (OuterVolumeSpecName: "kube-api-access-szz29") pod "3891db6b-832a-4f78-9d91-2945136ac41d" (UID: "3891db6b-832a-4f78-9d91-2945136ac41d"). InnerVolumeSpecName "kube-api-access-szz29". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.331745 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3891db6b-832a-4f78-9d91-2945136ac41d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3891db6b-832a-4f78-9d91-2945136ac41d" (UID: "3891db6b-832a-4f78-9d91-2945136ac41d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.397637 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d6kfv_must-gather-qf7z2_09855131-fcae-4c41-83c2-2874fd6e7068/gather/0.log" Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.425724 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3891db6b-832a-4f78-9d91-2945136ac41d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.425758 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3891db6b-832a-4f78-9d91-2945136ac41d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.425770 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szz29\" (UniqueName: \"kubernetes.io/projected/3891db6b-832a-4f78-9d91-2945136ac41d-kube-api-access-szz29\") on node \"crc\" DevicePath \"\"" Mar 13 
21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.949152 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" event={"ID":"3891db6b-832a-4f78-9d91-2945136ac41d","Type":"ContainerDied","Data":"7422975de7713d05618db81d72e5d50a01a667ab335640040ec85093c38b0a10"} Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.949509 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7422975de7713d05618db81d72e5d50a01a667ab335640040ec85093c38b0a10" Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.949199 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.950934 4790 generic.go:334] "Generic (PLEG): container finished" podID="c83c8e92-0c0b-4b43-8391-1c63b8755c64" containerID="8545a35dacc04bd1f83edfa9e8f634e87f549dc58a48933c26d35edf437fcb49" exitCode=0 Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.950981 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557290-2w2zk" event={"ID":"c83c8e92-0c0b-4b43-8391-1c63b8755c64","Type":"ContainerDied","Data":"8545a35dacc04bd1f83edfa9e8f634e87f549dc58a48933c26d35edf437fcb49"} Mar 13 21:30:04 crc kubenswrapper[4790]: I0313 21:30:04.331576 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw"] Mar 13 21:30:04 crc kubenswrapper[4790]: I0313 21:30:04.339966 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw"] Mar 13 21:30:05 crc kubenswrapper[4790]: I0313 21:30:05.334493 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557290-2w2zk" Mar 13 21:30:05 crc kubenswrapper[4790]: I0313 21:30:05.479585 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzbs4\" (UniqueName: \"kubernetes.io/projected/c83c8e92-0c0b-4b43-8391-1c63b8755c64-kube-api-access-lzbs4\") pod \"c83c8e92-0c0b-4b43-8391-1c63b8755c64\" (UID: \"c83c8e92-0c0b-4b43-8391-1c63b8755c64\") " Mar 13 21:30:05 crc kubenswrapper[4790]: I0313 21:30:05.487035 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c83c8e92-0c0b-4b43-8391-1c63b8755c64-kube-api-access-lzbs4" (OuterVolumeSpecName: "kube-api-access-lzbs4") pod "c83c8e92-0c0b-4b43-8391-1c63b8755c64" (UID: "c83c8e92-0c0b-4b43-8391-1c63b8755c64"). InnerVolumeSpecName "kube-api-access-lzbs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:30:05 crc kubenswrapper[4790]: I0313 21:30:05.582004 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzbs4\" (UniqueName: \"kubernetes.io/projected/c83c8e92-0c0b-4b43-8391-1c63b8755c64-kube-api-access-lzbs4\") on node \"crc\" DevicePath \"\"" Mar 13 21:30:05 crc kubenswrapper[4790]: I0313 21:30:05.670997 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0001db4d-b91a-473e-bfff-794d8663885f" path="/var/lib/kubelet/pods/0001db4d-b91a-473e-bfff-794d8663885f/volumes" Mar 13 21:30:05 crc kubenswrapper[4790]: I0313 21:30:05.970618 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557290-2w2zk" event={"ID":"c83c8e92-0c0b-4b43-8391-1c63b8755c64","Type":"ContainerDied","Data":"0a0cecd5c3e5fda8f03d676b699f878194e10eb22bb6b8ec985f0446cc2c1fa3"} Mar 13 21:30:05 crc kubenswrapper[4790]: I0313 21:30:05.970831 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a0cecd5c3e5fda8f03d676b699f878194e10eb22bb6b8ec985f0446cc2c1fa3" Mar 13 21:30:05 
crc kubenswrapper[4790]: I0313 21:30:05.970704 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557290-2w2zk" Mar 13 21:30:06 crc kubenswrapper[4790]: I0313 21:30:06.384586 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557284-fszb4"] Mar 13 21:30:06 crc kubenswrapper[4790]: I0313 21:30:06.392367 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557284-fszb4"] Mar 13 21:30:07 crc kubenswrapper[4790]: I0313 21:30:07.671467 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e71263f0-7309-4046-b71d-2ae38e13d27c" path="/var/lib/kubelet/pods/e71263f0-7309-4046-b71d-2ae38e13d27c/volumes" Mar 13 21:30:09 crc kubenswrapper[4790]: I0313 21:30:09.670694 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:30:09 crc kubenswrapper[4790]: E0313 21:30:09.671298 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:30:11 crc kubenswrapper[4790]: I0313 21:30:11.695591 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d6kfv/must-gather-qf7z2"] Mar 13 21:30:11 crc kubenswrapper[4790]: I0313 21:30:11.695854 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-d6kfv/must-gather-qf7z2" podUID="09855131-fcae-4c41-83c2-2874fd6e7068" containerName="copy" containerID="cri-o://f2a116706cb391169c51f4180351f0429f8c305252cf4438d5b41c53f1d8a0cb" gracePeriod=2 Mar 13 21:30:11 
crc kubenswrapper[4790]: I0313 21:30:11.710276 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d6kfv/must-gather-qf7z2"] Mar 13 21:30:12 crc kubenswrapper[4790]: I0313 21:30:12.022139 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d6kfv_must-gather-qf7z2_09855131-fcae-4c41-83c2-2874fd6e7068/copy/0.log" Mar 13 21:30:12 crc kubenswrapper[4790]: I0313 21:30:12.023164 4790 generic.go:334] "Generic (PLEG): container finished" podID="09855131-fcae-4c41-83c2-2874fd6e7068" containerID="f2a116706cb391169c51f4180351f0429f8c305252cf4438d5b41c53f1d8a0cb" exitCode=143 Mar 13 21:30:12 crc kubenswrapper[4790]: I0313 21:30:12.202643 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d6kfv_must-gather-qf7z2_09855131-fcae-4c41-83c2-2874fd6e7068/copy/0.log" Mar 13 21:30:12 crc kubenswrapper[4790]: I0313 21:30:12.203191 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6kfv/must-gather-qf7z2" Mar 13 21:30:12 crc kubenswrapper[4790]: I0313 21:30:12.315505 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnqnf\" (UniqueName: \"kubernetes.io/projected/09855131-fcae-4c41-83c2-2874fd6e7068-kube-api-access-hnqnf\") pod \"09855131-fcae-4c41-83c2-2874fd6e7068\" (UID: \"09855131-fcae-4c41-83c2-2874fd6e7068\") " Mar 13 21:30:12 crc kubenswrapper[4790]: I0313 21:30:12.316052 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09855131-fcae-4c41-83c2-2874fd6e7068-must-gather-output\") pod \"09855131-fcae-4c41-83c2-2874fd6e7068\" (UID: \"09855131-fcae-4c41-83c2-2874fd6e7068\") " Mar 13 21:30:12 crc kubenswrapper[4790]: I0313 21:30:12.335605 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/09855131-fcae-4c41-83c2-2874fd6e7068-kube-api-access-hnqnf" (OuterVolumeSpecName: "kube-api-access-hnqnf") pod "09855131-fcae-4c41-83c2-2874fd6e7068" (UID: "09855131-fcae-4c41-83c2-2874fd6e7068"). InnerVolumeSpecName "kube-api-access-hnqnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:30:12 crc kubenswrapper[4790]: I0313 21:30:12.417697 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnqnf\" (UniqueName: \"kubernetes.io/projected/09855131-fcae-4c41-83c2-2874fd6e7068-kube-api-access-hnqnf\") on node \"crc\" DevicePath \"\"" Mar 13 21:30:12 crc kubenswrapper[4790]: I0313 21:30:12.474317 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09855131-fcae-4c41-83c2-2874fd6e7068-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "09855131-fcae-4c41-83c2-2874fd6e7068" (UID: "09855131-fcae-4c41-83c2-2874fd6e7068"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:30:12 crc kubenswrapper[4790]: I0313 21:30:12.519266 4790 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09855131-fcae-4c41-83c2-2874fd6e7068-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 13 21:30:13 crc kubenswrapper[4790]: I0313 21:30:13.033313 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d6kfv_must-gather-qf7z2_09855131-fcae-4c41-83c2-2874fd6e7068/copy/0.log" Mar 13 21:30:13 crc kubenswrapper[4790]: I0313 21:30:13.034907 4790 scope.go:117] "RemoveContainer" containerID="f2a116706cb391169c51f4180351f0429f8c305252cf4438d5b41c53f1d8a0cb" Mar 13 21:30:13 crc kubenswrapper[4790]: I0313 21:30:13.034932 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6kfv/must-gather-qf7z2" Mar 13 21:30:13 crc kubenswrapper[4790]: I0313 21:30:13.062301 4790 scope.go:117] "RemoveContainer" containerID="3a1d32bb413765095ebca93898109c48096d1087d7c63a4448ef4a85e11734c8" Mar 13 21:30:13 crc kubenswrapper[4790]: I0313 21:30:13.669397 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09855131-fcae-4c41-83c2-2874fd6e7068" path="/var/lib/kubelet/pods/09855131-fcae-4c41-83c2-2874fd6e7068/volumes" Mar 13 21:30:19 crc kubenswrapper[4790]: I0313 21:30:19.889257 4790 scope.go:117] "RemoveContainer" containerID="0e3d04fd35f846d0f8577da19c18befcb486539f0a1127e22cf8b9a5e5547ef3" Mar 13 21:30:19 crc kubenswrapper[4790]: I0313 21:30:19.928757 4790 scope.go:117] "RemoveContainer" containerID="b6265fc857b5a799a558f01ccfe69d069d440ad15cd4409b5956f9cdc01bead3" Mar 13 21:30:23 crc kubenswrapper[4790]: I0313 21:30:23.659571 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:30:23 crc kubenswrapper[4790]: E0313 21:30:23.660513 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:30:36 crc kubenswrapper[4790]: I0313 21:30:36.659366 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:30:36 crc kubenswrapper[4790]: E0313 21:30:36.662049 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:30:50 crc kubenswrapper[4790]: I0313 21:30:50.660783 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:30:50 crc kubenswrapper[4790]: E0313 21:30:50.662029 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:31:05 crc kubenswrapper[4790]: I0313 21:31:05.660445 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:31:05 crc kubenswrapper[4790]: E0313 21:31:05.661084 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:31:19 crc kubenswrapper[4790]: I0313 21:31:19.669905 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:31:19 crc kubenswrapper[4790]: E0313 21:31:19.670754 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:31:34 crc kubenswrapper[4790]: I0313 21:31:34.659911 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:31:34 crc kubenswrapper[4790]: E0313 21:31:34.660681 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:31:48 crc kubenswrapper[4790]: I0313 21:31:48.660084 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:31:48 crc kubenswrapper[4790]: E0313 21:31:48.660835 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:31:59 crc kubenswrapper[4790]: I0313 21:31:59.670292 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:31:59 crc kubenswrapper[4790]: E0313 21:31:59.672267 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.142687 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557292-j9s66"] Mar 13 21:32:00 crc kubenswrapper[4790]: E0313 21:32:00.143344 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09855131-fcae-4c41-83c2-2874fd6e7068" containerName="copy" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.143360 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="09855131-fcae-4c41-83c2-2874fd6e7068" containerName="copy" Mar 13 21:32:00 crc kubenswrapper[4790]: E0313 21:32:00.143487 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83c8e92-0c0b-4b43-8391-1c63b8755c64" containerName="oc" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.143504 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83c8e92-0c0b-4b43-8391-1c63b8755c64" containerName="oc" Mar 13 21:32:00 crc kubenswrapper[4790]: E0313 21:32:00.143533 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3891db6b-832a-4f78-9d91-2945136ac41d" containerName="collect-profiles" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.143541 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3891db6b-832a-4f78-9d91-2945136ac41d" containerName="collect-profiles" Mar 13 21:32:00 crc kubenswrapper[4790]: E0313 21:32:00.143560 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09855131-fcae-4c41-83c2-2874fd6e7068" containerName="gather" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.143571 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="09855131-fcae-4c41-83c2-2874fd6e7068" containerName="gather" Mar 13 21:32:00 crc 
kubenswrapper[4790]: I0313 21:32:00.143793 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="09855131-fcae-4c41-83c2-2874fd6e7068" containerName="gather" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.143807 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="3891db6b-832a-4f78-9d91-2945136ac41d" containerName="collect-profiles" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.143831 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83c8e92-0c0b-4b43-8391-1c63b8755c64" containerName="oc" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.143848 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="09855131-fcae-4c41-83c2-2874fd6e7068" containerName="copy" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.144589 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557292-j9s66" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.148669 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.148787 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.148855 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.154132 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557292-j9s66"] Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.216207 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckbc8\" (UniqueName: \"kubernetes.io/projected/6fa77812-592b-430d-b5db-15fab10e53e4-kube-api-access-ckbc8\") pod \"auto-csr-approver-29557292-j9s66\" (UID: 
\"6fa77812-592b-430d-b5db-15fab10e53e4\") " pod="openshift-infra/auto-csr-approver-29557292-j9s66" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.318010 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckbc8\" (UniqueName: \"kubernetes.io/projected/6fa77812-592b-430d-b5db-15fab10e53e4-kube-api-access-ckbc8\") pod \"auto-csr-approver-29557292-j9s66\" (UID: \"6fa77812-592b-430d-b5db-15fab10e53e4\") " pod="openshift-infra/auto-csr-approver-29557292-j9s66" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.342706 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckbc8\" (UniqueName: \"kubernetes.io/projected/6fa77812-592b-430d-b5db-15fab10e53e4-kube-api-access-ckbc8\") pod \"auto-csr-approver-29557292-j9s66\" (UID: \"6fa77812-592b-430d-b5db-15fab10e53e4\") " pod="openshift-infra/auto-csr-approver-29557292-j9s66" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.529859 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557292-j9s66" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.958769 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557292-j9s66"] Mar 13 21:32:01 crc kubenswrapper[4790]: I0313 21:32:01.966707 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557292-j9s66" event={"ID":"6fa77812-592b-430d-b5db-15fab10e53e4","Type":"ContainerStarted","Data":"cd23023c8aff4e6acd5a4837074b7bbf1bbe449797445fb899e0f6e65d54b90f"} Mar 13 21:32:02 crc kubenswrapper[4790]: I0313 21:32:02.979627 4790 generic.go:334] "Generic (PLEG): container finished" podID="6fa77812-592b-430d-b5db-15fab10e53e4" containerID="d84e80b86d9cdab806e9b9bb5e0fdea7bd8634254927c159e0360e5f07881efd" exitCode=0 Mar 13 21:32:02 crc kubenswrapper[4790]: I0313 21:32:02.979757 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557292-j9s66" event={"ID":"6fa77812-592b-430d-b5db-15fab10e53e4","Type":"ContainerDied","Data":"d84e80b86d9cdab806e9b9bb5e0fdea7bd8634254927c159e0360e5f07881efd"} Mar 13 21:32:04 crc kubenswrapper[4790]: I0313 21:32:04.338023 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557292-j9s66" Mar 13 21:32:04 crc kubenswrapper[4790]: I0313 21:32:04.421494 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckbc8\" (UniqueName: \"kubernetes.io/projected/6fa77812-592b-430d-b5db-15fab10e53e4-kube-api-access-ckbc8\") pod \"6fa77812-592b-430d-b5db-15fab10e53e4\" (UID: \"6fa77812-592b-430d-b5db-15fab10e53e4\") " Mar 13 21:32:04 crc kubenswrapper[4790]: I0313 21:32:04.427491 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fa77812-592b-430d-b5db-15fab10e53e4-kube-api-access-ckbc8" (OuterVolumeSpecName: "kube-api-access-ckbc8") pod "6fa77812-592b-430d-b5db-15fab10e53e4" (UID: "6fa77812-592b-430d-b5db-15fab10e53e4"). InnerVolumeSpecName "kube-api-access-ckbc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:32:04 crc kubenswrapper[4790]: I0313 21:32:04.524215 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckbc8\" (UniqueName: \"kubernetes.io/projected/6fa77812-592b-430d-b5db-15fab10e53e4-kube-api-access-ckbc8\") on node \"crc\" DevicePath \"\"" Mar 13 21:32:04 crc kubenswrapper[4790]: I0313 21:32:04.997768 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557292-j9s66" event={"ID":"6fa77812-592b-430d-b5db-15fab10e53e4","Type":"ContainerDied","Data":"cd23023c8aff4e6acd5a4837074b7bbf1bbe449797445fb899e0f6e65d54b90f"} Mar 13 21:32:04 crc kubenswrapper[4790]: I0313 21:32:04.997805 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd23023c8aff4e6acd5a4837074b7bbf1bbe449797445fb899e0f6e65d54b90f" Mar 13 21:32:04 crc kubenswrapper[4790]: I0313 21:32:04.997827 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557292-j9s66" Mar 13 21:32:05 crc kubenswrapper[4790]: I0313 21:32:05.410816 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557286-j2hgs"] Mar 13 21:32:05 crc kubenswrapper[4790]: I0313 21:32:05.419183 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557286-j2hgs"] Mar 13 21:32:05 crc kubenswrapper[4790]: I0313 21:32:05.670793 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93d597e8-cf42-4f34-a6c1-ffe9416a562b" path="/var/lib/kubelet/pods/93d597e8-cf42-4f34-a6c1-ffe9416a562b/volumes" Mar 13 21:32:13 crc kubenswrapper[4790]: I0313 21:32:13.660537 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:32:13 crc kubenswrapper[4790]: E0313 21:32:13.661809 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:32:20 crc kubenswrapper[4790]: I0313 21:32:20.081606 4790 scope.go:117] "RemoveContainer" containerID="19ec3b81cfc93adcffb8135210e1ea8d379fb945e3eddc6ee978b60b4ce52405" Mar 13 21:32:20 crc kubenswrapper[4790]: I0313 21:32:20.105412 4790 scope.go:117] "RemoveContainer" containerID="d255e7ab1f308e1f21736aa4f57843906cd9283c436db74b17e9a79b7ff4810a" Mar 13 21:32:28 crc kubenswrapper[4790]: I0313 21:32:28.660230 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:32:28 crc kubenswrapper[4790]: E0313 21:32:28.661110 4790 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:32:39 crc kubenswrapper[4790]: I0313 21:32:39.667508 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:32:39 crc kubenswrapper[4790]: E0313 21:32:39.668336 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:32:51 crc kubenswrapper[4790]: I0313 21:32:51.659668 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:32:51 crc kubenswrapper[4790]: E0313 21:32:51.660435 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:33:05 crc kubenswrapper[4790]: I0313 21:33:05.660620 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:33:05 crc kubenswrapper[4790]: E0313 21:33:05.661623 4790 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:33:16 crc kubenswrapper[4790]: I0313 21:33:16.659863 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:33:16 crc kubenswrapper[4790]: E0313 21:33:16.660656 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:33:27 crc kubenswrapper[4790]: I0313 21:33:27.659668 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:33:27 crc kubenswrapper[4790]: E0313 21:33:27.660477 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:33:39 crc kubenswrapper[4790]: I0313 21:33:39.665653 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:33:39 crc kubenswrapper[4790]: E0313 21:33:39.666360 4790 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:33:53 crc kubenswrapper[4790]: I0313 21:33:53.660712 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:33:53 crc kubenswrapper[4790]: E0313 21:33:53.661792 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.141273 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557294-pd2t2"] Mar 13 21:34:00 crc kubenswrapper[4790]: E0313 21:34:00.142091 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fa77812-592b-430d-b5db-15fab10e53e4" containerName="oc" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.142114 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa77812-592b-430d-b5db-15fab10e53e4" containerName="oc" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.142370 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fa77812-592b-430d-b5db-15fab10e53e4" containerName="oc" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.143061 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557294-pd2t2" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.146155 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.146686 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.146944 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.151504 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557294-pd2t2"] Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.281511 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4rzt\" (UniqueName: \"kubernetes.io/projected/552e6693-6bb3-4722-8be8-4ed07c6c3953-kube-api-access-j4rzt\") pod \"auto-csr-approver-29557294-pd2t2\" (UID: \"552e6693-6bb3-4722-8be8-4ed07c6c3953\") " pod="openshift-infra/auto-csr-approver-29557294-pd2t2" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.384569 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4rzt\" (UniqueName: \"kubernetes.io/projected/552e6693-6bb3-4722-8be8-4ed07c6c3953-kube-api-access-j4rzt\") pod \"auto-csr-approver-29557294-pd2t2\" (UID: \"552e6693-6bb3-4722-8be8-4ed07c6c3953\") " pod="openshift-infra/auto-csr-approver-29557294-pd2t2" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.404470 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4rzt\" (UniqueName: \"kubernetes.io/projected/552e6693-6bb3-4722-8be8-4ed07c6c3953-kube-api-access-j4rzt\") pod \"auto-csr-approver-29557294-pd2t2\" (UID: \"552e6693-6bb3-4722-8be8-4ed07c6c3953\") " 
pod="openshift-infra/auto-csr-approver-29557294-pd2t2" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.465450 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557294-pd2t2" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.923184 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557294-pd2t2"] Mar 13 21:34:00 crc kubenswrapper[4790]: W0313 21:34:00.928108 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod552e6693_6bb3_4722_8be8_4ed07c6c3953.slice/crio-86159326cb16f3e9e7a3dbe79ae5f9b83352a840b38032c8f9ef2c1291a6818c WatchSource:0}: Error finding container 86159326cb16f3e9e7a3dbe79ae5f9b83352a840b38032c8f9ef2c1291a6818c: Status 404 returned error can't find the container with id 86159326cb16f3e9e7a3dbe79ae5f9b83352a840b38032c8f9ef2c1291a6818c Mar 13 21:34:01 crc kubenswrapper[4790]: I0313 21:34:01.036979 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557294-pd2t2" event={"ID":"552e6693-6bb3-4722-8be8-4ed07c6c3953","Type":"ContainerStarted","Data":"86159326cb16f3e9e7a3dbe79ae5f9b83352a840b38032c8f9ef2c1291a6818c"} Mar 13 21:34:03 crc kubenswrapper[4790]: I0313 21:34:03.057515 4790 generic.go:334] "Generic (PLEG): container finished" podID="552e6693-6bb3-4722-8be8-4ed07c6c3953" containerID="2c6aa1facaa53aecef5b73114336558c2aa3bd5a0b9119d7519041f577dcea9e" exitCode=0 Mar 13 21:34:03 crc kubenswrapper[4790]: I0313 21:34:03.057612 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557294-pd2t2" event={"ID":"552e6693-6bb3-4722-8be8-4ed07c6c3953","Type":"ContainerDied","Data":"2c6aa1facaa53aecef5b73114336558c2aa3bd5a0b9119d7519041f577dcea9e"} Mar 13 21:34:04 crc kubenswrapper[4790]: I0313 21:34:04.509359 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557294-pd2t2" Mar 13 21:34:04 crc kubenswrapper[4790]: I0313 21:34:04.659891 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:34:04 crc kubenswrapper[4790]: I0313 21:34:04.659941 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4rzt\" (UniqueName: \"kubernetes.io/projected/552e6693-6bb3-4722-8be8-4ed07c6c3953-kube-api-access-j4rzt\") pod \"552e6693-6bb3-4722-8be8-4ed07c6c3953\" (UID: \"552e6693-6bb3-4722-8be8-4ed07c6c3953\") " Mar 13 21:34:04 crc kubenswrapper[4790]: E0313 21:34:04.660304 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:34:04 crc kubenswrapper[4790]: I0313 21:34:04.666541 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/552e6693-6bb3-4722-8be8-4ed07c6c3953-kube-api-access-j4rzt" (OuterVolumeSpecName: "kube-api-access-j4rzt") pod "552e6693-6bb3-4722-8be8-4ed07c6c3953" (UID: "552e6693-6bb3-4722-8be8-4ed07c6c3953"). InnerVolumeSpecName "kube-api-access-j4rzt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:34:04 crc kubenswrapper[4790]: I0313 21:34:04.762574 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4rzt\" (UniqueName: \"kubernetes.io/projected/552e6693-6bb3-4722-8be8-4ed07c6c3953-kube-api-access-j4rzt\") on node \"crc\" DevicePath \"\"" Mar 13 21:34:05 crc kubenswrapper[4790]: I0313 21:34:05.075174 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557294-pd2t2" event={"ID":"552e6693-6bb3-4722-8be8-4ed07c6c3953","Type":"ContainerDied","Data":"86159326cb16f3e9e7a3dbe79ae5f9b83352a840b38032c8f9ef2c1291a6818c"} Mar 13 21:34:05 crc kubenswrapper[4790]: I0313 21:34:05.075500 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86159326cb16f3e9e7a3dbe79ae5f9b83352a840b38032c8f9ef2c1291a6818c" Mar 13 21:34:05 crc kubenswrapper[4790]: I0313 21:34:05.075639 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557294-pd2t2" Mar 13 21:34:05 crc kubenswrapper[4790]: I0313 21:34:05.576895 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557288-qmdk7"] Mar 13 21:34:05 crc kubenswrapper[4790]: I0313 21:34:05.586461 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557288-qmdk7"] Mar 13 21:34:05 crc kubenswrapper[4790]: I0313 21:34:05.668860 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d3de3f1-0534-4203-b465-d512d6c80287" path="/var/lib/kubelet/pods/4d3de3f1-0534-4203-b465-d512d6c80287/volumes" Mar 13 21:34:19 crc kubenswrapper[4790]: I0313 21:34:19.666880 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:34:20 crc kubenswrapper[4790]: I0313 21:34:20.192805 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"7b5f1e5b820d638552a16c09dc9dbbd33ace522261e8adb6296689912e8ae35c"} Mar 13 21:34:20 crc kubenswrapper[4790]: I0313 21:34:20.241892 4790 scope.go:117] "RemoveContainer" containerID="c265de87623abb9a96ed933e22a3276547bc13888411d097434621497cc49ed1"